Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US pages being indexed by Google and served in US search results. Conversely, we have US English pages showing in the Japan Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that lets users manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US pages, failing to detect the region because of our URL structure?
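A region drop-down like this typically renders as ordinary anchor links, which Googlebot crawls like any other link. A hypothetical sketch of what such a selector might look like (the markup below, and any URLs beyond the two examples given in the question, are assumptions):

```html
<!-- Hypothetical region selector: each option is a plain crawlable link. -->
<!-- Googlebot follows these like any other <a href>, so every regional   -->
<!-- copy of the page gets discovered and indexed.                        -->
<nav class="region-selector">
  <a href="/us/en/prod4130078/2500058/catalog20038/">United States</a>
  <a href="/ca/en/prod4130078/2500058/catalog50008/">Canada</a>
  <!-- ...other regions... -->
</nav>
```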
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are all the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or http headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try hreflang tags to clue Google in that they're the same page geotargeted at users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here.
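Pulling the list above together, here's a hedged sketch of what these annotations can look like, using the two example URLs from the question (the domain name and exact tag set are assumptions, not your actual markup):

```html
<!-- hreflang: placed in the <head> of BOTH pages, each page listing all
     alternates, telling Google the same product page is geotargeted at
     users in different regions. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/en/prod4130078/2500058/catalog20038/" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/ca/en/prod4130078/2500058/catalog50008/" />

<!-- Language/region meta tag that Bing has historically read: -->
<meta http-equiv="content-language" content="en-us" />
```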
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You can't rely on barring Google's crawler from anything on your site, no matter how well you code it. Even if you marked the URLs nofollow, that would not stop the bot.
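For completeness, the mechanism that actually blocks crawling (as opposed to nofollow, which only withholds link equity) is robots.txt. A hypothetical sketch follows, with the caveat that a disallowed URL can still end up indexed if other pages link to it, so this alone wouldn't fix wrong-region results:

```text
# Hypothetical robots.txt blocking Googlebot from the Canadian folder.
# Blocks crawling, but the URLs can still appear in results if linked
# from elsewhere.
User-agent: Googlebot
Disallow: /ca/
```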
Another factor is whether all your content is in English (as your URL structure suggests). Google does a terrible job of separating international content when it is all in the same language, on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to be even better off, you'd host that content on a server in that country.