Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in Japanese Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that lets users manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages, failing to detect the region because of our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international SEO that often, so the full list of methods isn't always top of mind!
-
Here are the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here, and the sketch after this list).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page, just geotargeted to users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
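To make the meta tag and hreflang points concrete, here's a rough sketch of what the head section of the US product page from the question might contain. The www.example.com hostname is a placeholder (the real domain isn't given here), the URLs are the two Angie posted, and the Bing tag follows the content-language convention Bing documents:
<!-- Sketch only: www.example.com stands in for the real hostname -->
<!-- Tell Bing the language and country of this page (en-us here, en-ca on the Canadian version) -->
<meta http-equiv="content-language" content="en-us">
<!-- hreflang annotations tying the US and Canadian versions together for Google -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
<!-- Optional catch-all for visitors who match none of the listed regions -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
Each regional version should carry the same set of annotations, itself included, so Google can map the alternates to one another instead of picking one page to show everywhere.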
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
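For reference, a nofollowed region-switcher link would look something like this hypothetical markup - the attribute only withholds PageRank, and Google can still discover and index the target URL from other sources:
<!-- Hypothetical drop-down link: nofollow withholds PageRank but does not keep /ca/ pages out of the index -->
<a href="/ca/en/prod4130078/2500058/catalog50008/" rel="nofollow">Canada (English)</a>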
-
It's absolutely possible that's what's happening. You can't rely on keeping Googlebot out of any part of your site, no matter how well you code it. Even if you put nofollow on those links, it wouldn't stop the bot.
Another factor is that all of your content is in English (as your URL structure suggests). Google does a terrible job of keeping international content separate when it's all in the same language on the same root domain.
Separating the content in a way Google can't confuse is vital. Since I expect you don't intend to change the language across the sites, your best move would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to do even better, you'd host that content on a server in that country.