Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What countries does Google crawl from? Is it only the US, or do they also crawl from Europe, Asia, etc.?
-
Where does Google crawl the web from? Is it only from the US, or do they crawl from a European base too? The reason for asking is GeoIP redirection. For example, if a website uses GeoIP redirection to send all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
-
Hi Keith,
In my experience Google mainly crawls from the US.
You're quite right that GeoIP redirection can cause major issues with indexation: if you're redirecting everything from the US to the .com, Googlebot (crawling from US IPs) can't see the .co.uk site.
As such, I'm not a fan. Rather than implementing a hard redirect, I prefer Amazon's approach. If you visit amazon.com from a UK IP you get a JavaScript overlay that invites you to visit the .co.uk version of the site instead - they let the user decide which site to view rather than actually redirecting them.
This is a nice solution as it ensures that the search bots can crawl both versions of the site, and rankings aren't endangered.
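For anyone wondering what that looks like in code, here is a rough sketch of the overlay approach - not Amazon's actual implementation; the /api/visitor-country endpoint and the example.com/example.co.uk domains are placeholders:

// Suggest the local site instead of redirecting to it.
// The page itself is served to everyone, search bots included.
async function suggestLocalSite(): Promise<void> {
  const response = await fetch("/api/visitor-country"); // hypothetical geo-lookup endpoint
  const { country } = await response.json();
  if (country === "GB" && location.hostname === "www.example.com") {
    const banner = document.createElement("div");
    banner.innerHTML =
      'Looks like you are in the UK - would you like to visit ' +
      '<a href="https://www.example.co.uk/">our .co.uk site</a> instead?';
    banner.style.cssText =
      "position:fixed;bottom:0;left:0;right:0;padding:1em;background:#fffbe6;text-align:center;";
    document.body.appendChild(banner);
  }
}
document.addEventListener("DOMContentLoaded", () => { void suggestLocalSite(); });

Because nothing is ever redirected, a bot can fetch both the .com and the .co.uk pages no matter which IP it crawls from.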
I hope this helps,
Hannah
-
Keith, I am having the same issue and I agree with you. The fact that Google has data centers in Europe does not necessarily mean it crawls and indexes from there. I also want to set up Europe and US GeoIP redirection. It would be great to get Mozzers' opinions on this. Hopefully this post gets a fresh review.
-
Interesting question - I'd quite like to know what happens here too.
Matt Cutts recently posted a video on cloaking (http://youtu.be/QHtnfOgp65Q) saying that as long as you don't do anything 'special' for Googlebot you're OK. But presumably, if you are redirecting IPs based on location and you don't want to prevent Googlebot from accessing your site, then effectively you have to do something 'special' for Googlebot (i.e. you're doing one thing for everyone else and a different thing for Googlebot).
-
Hi,
The data center locations are interesting, but that isn't what I was looking for. I need to know whether Google crawls the web from any IPs other than US IPs.
To clear up the second question, let me be more specific:
Let's say Google is crawling a .co.uk site from a US IP address. The site uses GeoIP redirection to redirect all US traffic to the .com site. Therefore, when Google attempts to crawl the .co.uk site from a US IP address, it will be redirected to the .com site and never see the .co.uk site. Can anyone confirm that this is what happens?
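To make the scenario concrete, here is a minimal sketch of that kind of GeoIP redirect, written as hypothetical Express middleware - the hostnames and the lookupCountry stub are made up for illustration:

import express, { Request, Response, NextFunction } from "express";

// Stand-in for a real GeoIP lookup (e.g. a MaxMind database); returns an ISO country code.
function lookupCountry(ip: string): string {
  return "US"; // hard-coded here purely for illustration
}

const app = express();

app.use((req: Request, res: Response, next: NextFunction) => {
  const country = lookupCountry(req.ip ?? "");
  if (req.hostname === "www.example.co.uk" && country === "US") {
    // A crawler arriving from a US IP takes this branch every time,
    // so the .co.uk pages are never fetched or indexed.
    return res.redirect(301, "https://www.example.com" + req.originalUrl);
  }
  next();
});

app.listen(3000);

If Googlebot only ever crawls from US IPs, middleware like this means it only ever sees the .com version - which is exactly the behaviour being asked about.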
-
I found an article from 2008 that shows Google data centre locations: http://bit.ly/mONhf9
Your other question is a bit confusing. Why wouldn't Google see the UK site?
Related Questions
-
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in countries where the games and content are legally allowed.
International SEO | | MarkCanning
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site, they get directed to a restricted-location page with the following message: "RESTRICTED LOCATION - Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted-access message. However, we still want Google to be able to access, crawl and index our pages. Can I ask how we do this without being penalised for cloaking, etc.? Would this approach be OK? (Please see below.) We continue as in the present situation, showing visitors from the USA a restricted message. However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window, while Googlebot is allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks
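For reference, the "paywall schema" mentioned above usually means schema.org's isAccessibleForFree markup, which Google documents for paywalled content; whether it is also appropriate for geo-restricted (rather than paid) content is the poster's own assumption. A sketch, with a hypothetical URL and CSS class:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "url": "https://domain.com/en-ca/some-page",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".restricted-content"
  }
}
</script>
-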
International SEO - Targeting US and UK markets
Hi folks, I have a client based in Italy who set up a site that sells travel experiences in the south of Italy (the site currently sits on a server in Italy). The site has been set up on a gTLD: www.example.com. They only want to target the US and UK markets to promote their travel experiences, and the site only has an English version (it does not currently offer an Italian version). If they decide to stay with the gTLD rather than change to ccTLDs (which would be ideal from my point of view), what are the steps to set this up correctly in GSC? They currently have only one property registered in GSC: www.example.com. Therefore I guess the next steps are: add a new property - www.example.com/uk - and set up geo-targeting for the UK; for the existing property - www.example.com/ - set up geo-targeting for the US. If the client does not have the budget to optimise the content for American and British English, would it still make sense to have two separate properties in GSC (example.com for the US market and example.com/uk for the UK market)? A few considerations: add a canonical tag to avoid duplicate content across the two versions of the site (in the event there is no budget to optimise the content for the US and UK markets)? Thank you all in advance for looking into this. David
International SEO | | Davide1984
-
Worldwide and Europe hreflang implementation.
Hi Moz! We're having quite a discussion here and I'd like some input. Let me explain the situation and what we plan to do so far. One of our clients has two separate markets: worldwide and Europe. Both versions of the pages will be mostly the same, except that each will have its own products. So basically, we want to show only the European EN version to Europe and the standard EN version to the rest of the world; the same goes for FR and ES. As for IT, DE, CS and SK, they will only be present within the European version. Since we cannot target all of Europe with a single hreflang tag, we might have to do it for every single European country. On this subject, SMX Munich recently had quite an interesting session, with confirmation from John Mueller that we can target a single URL more than once with different hreflang tags. You can read more here: http://www.rebelytics.com/multiple-hreflang-tags-one-url/ So with all this in mind, here's the implementation we plan to do:
www.example.com/en/ - self-canonical
www.example.com/fr/ - hreflang = fr
www.example.com/es/ - hreflang = es
www.example.eu/it/ - hreflang = it
www.example.eu/de/ - hreflang = de
www.example.eu/cs/ - hreflang = cs
www.example.eu/sk/ - hreflang = sk
www.example.eu/fr/ - hreflang = be-fr
www.example.eu/fr/ - hreflang = ch-fr
www.example.eu/fr/ - hreflang = cz-fr
www.example.eu/fr/ - hreflang = de-fr
www.example.eu/fr/ - hreflang = es-fr
www.example.eu/fr/ - hreflang = fr-fr
www.example.eu/fr/ - hreflang = uk-fr
www.example.eu/fr/ - hreflang = gr-fr
www.example.eu/fr/ - hreflang = hr-fr
etc. - this will be done for all European countries (FR, EN and ES)
www.example.com/en/ - x-default
Let me know what you guys think. Thanks!
International SEO | | Netleaf.ca
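For reference, hreflang annotations like those listed above are typically expressed as link elements in the head of every version (or sent via XML sitemaps or HTTP headers). A simplified sketch using the question's example URLs, showing only a few of the country entries - note that hreflang values are language-region, e.g. fr-be rather than be-fr:

<!-- In the <head> of https://www.example.eu/fr/ -->
<link rel="alternate" hreflang="fr-be" href="https://www.example.eu/fr/" />
<link rel="alternate" hreflang="fr-ch" href="https://www.example.eu/fr/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.eu/fr/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/" />

The same set of annotations needs to appear on every URL in the cluster so the references are reciprocal.
-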
Has anyone seen negative SEO effects from using the Google Translate API?
We have a site currently in development that is using the Google Translate API, and I am having a massive issue getting Screaming Frog to crawl it. All of our non-native-English-speaking employees have read through the translated copy in their native languages, and the general consensus is that it reads at a 5th-grade level at best. My questions to the community are: has anyone implemented this API on a site, and has it (a) helped with gaining traffic from other languages/countries, and (b) hurt the site from an SEO standpoint?
International SEO | | VERBInteractive
-
Google does not index the UK version of our site, and serves the US version instead. Do I need to remove the hreflang for US?
Webmaster Tools indicates that only 25% of the pages on our UK domain with GBP prices are indexed.
International SEO | | lcourse
We have another US domain with identical content but USD prices, which is indexed fine. When I search in Google for site:mydomain, I see that most of my pages seem to appear, but in the rich snippets Google shows USD prices instead of the GBP prices which we publish on the page (the USD price is not published on the page, and I tested with a US proxy and the US price is nowhere in the source code). When I clicked on the result in Google to see the cached version of the page, Google showed me the US product page as the cached version of the UK product page. I use the following hreflang code:
<link rel="alternate" hreflang="en-US" href="https://www.domain.com/product" />
<link rel="alternate" hreflang="en-GB" href="https://www.domain.co.uk/product" />
The canonical of the UK page correctly refers to the UK page. Any ideas? Do I need to remove the hreflang for en-US to get the UK domain properly indexed in Google?
-
Poor Google.co.uk ranking for a UK-based .net, but great on Google.com
I run an extremely popular news & community website at http://www.onedirection.net, but we're having a few ranking issues in Google.co.uk. The site gets most of its traffic from the USA, which isn't a bad thing - but for our key term "one direction" we currently don't rank at all on Google.co.uk. The site is located on a server based in Manchester, UK, and we used to rank very well earlier this year - fluttering about in positions 5-7 most of the time. However, around July we started to fall down to page 2 or 3, and at the start of this month we stopped ranking at all for "one direction" on Google.co.uk. On Google.com, however, we're very strong - always on page one. We're definitely indexed on .co.uk, just not for our main search term, which I find a bit frustrating.
All the content on our site is unique, and we write 2-4 stories every day. We have an active forum too, so a lot of our content is user-generated. We've never had any "unnatural link building" messages in Webmaster Tools, and our link profile looks fine to me. Do we just need more .co.uk links, or are we being penalised for something? (I can't imagine what, though.) It certainly seems that way. Another site, www.onedirection.co.uk, which is never updated and has a blatant ad for something completely unrelated on its homepage, currently ranks above us - which I find quite frankly appalling, as our site is pretty much regarded as the world's most popular One Direction news and fan site.
We've spent the last few months improving the page-load times of the site, and we've reduced any unnecessary internal linking. Approximately two months ago we launched a new forum, 301'ing all the old forum links to the new one, so that could have had an impact on rankings - but we'd expect to see an impact on Google.com as well if this were the issue. We definitely feel that we should be ranking higher on Google.co.uk. Does anyone have any ideas what the problems could be? Cheers, Chris.
International SEO | | PixelKicks
-
Non-US site pages indexed in US Google search
Hi, We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US EN pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, and not detecting the region due to our URL structure? Below are examples of two of our URLs for reference - one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem? Thank you. Angie
International SEO | | Corel
-
Google Webmaster Tools - International SEO Geo-Targeting site with Worldwide rankings
I have a client who already has rankings in the US & internationally. The site is broken down like this:
url.com (main site with USA & international rankings)
url.com/de
url.com/de-english
url.com/ng
url.com/au
url.com/ch
url.com/ch-french
url.com/etc
Each folder has its own sitemap & relevant content for its respective country. I am reading in Google Webmaster Tools > site config > settings the option under 'Learn More': "If you don't want your site associated with any location, select Unlisted." If I want to keep my client's international rankings the way they currently are on url.com, I should NOT geo-target the United States - so I select Unlisted, right? Would I then use geo-targeting on url.com/de, url.com/de-english, url.com/ng, url.com/au and so on?
International SEO | | Francisco_Meza