Non-US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japan Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that lets users manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages because it can't detect the region from our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to nofollow address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are all the things you can do to try to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/ and example.com/us/), and geotarget it there.
- Set meta tags or HTTP headers on each page to let Bing know the language and country.
- For duplicate or near-duplicate pages across different English-speaking localities, you can try hreflang tags to clue Google in that they're the same page geotargeted at users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but there's a rough sketch below.
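As a rough sketch of those last two bullets (assuming a placeholder domain of www.example.com, and reusing Angie's two product URLs from above), the tags on the Canadian page might look like this:

<!-- In the <head> of /ca/en/prod4130078/2500058/catalog50008/ -->
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
<!-- Bing-style geotargeting hint (the meta tag route from the second bullet): -->
<meta http-equiv="content-language" content="en-ca" />

One thing I am fairly sure of: every page in the set has to list every alternate, including itself, and the annotations must be reciprocal across pages, or Google will ignore them.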
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You can't rely on Googlebot being barred from crawling anything on your site, no matter how well you code it. Even if you marked the links nofollow, it would not stop the bot.
Another factor is that all your content is in English (as your URL structure suggests it is). Google does a terrible job of telling international content apart when it is all in the same language on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google "this domain is for this country"; to be even better off, you'd host that content on a server in that country.
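If you do move regions onto separate domains, hreflang can still tie the variants together across them. A minimal sketch, assuming hypothetical domains example.com and example.ca and a sitemap-based implementation rather than head tags:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/prod4130078/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/prod4130078/" />
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.ca/prod4130078/" />
  </url>
</urlset>

The sitemap on the other domain would need to carry the same pair of annotations for the return links to validate.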
Related Questions
-
Problem getting multilingual posts indexed on Google
In June last year I decided to make my site multilingual. The domain is: https://www.dailyblogprofits.com/ The main language is English, and I added Portuguese and a few posts in Spanish. What has happened since then? I started losing traffic from Google, and posts in Portuguese are not being indexed. I use the WPML plugin to make the site multilingual, and I had Yoast installed. This week I uninstalled Yoast, and when I type "site:dailyblogprofits.com/pt-br" into Google I now see Google indexing images, but still not the missing posts. I have around 145 posts in Portuguese, but Search Console shows only 57 hreflang tags. Any idea what the problem is? I'm willing to pay an SEO expert to resolve this problem for me.
International SEO | Cleber009
-
Redirect to 'default' or English (/en) version of site?
Hi Moz Community! I'm trying to work through a thorny internationalization issue with the 'default' and English versions of our site. We have an international set-up of:
- www.domain.com (in English)
- www.domain.com/en
- www.domain.com/en-gb
- www.domain.com/fr-fr
- www.domain.com/de-de
and so on... All the canonicals and hreflangs are set up, except the English-language version is giving me pause. If you visit www.domain.com, all of the internal links on that page (due to the current way our CMS works) point to www.domain.com/en/ versions of the pages. Content is identical between the two versions. The canonical on, say, www.domain.com/en/products points to www.domain.com/products. It feels like we're pulling in two different directions with our internationalization signals: links go one way, canonicals go another. Three options I can see:
- Remove the /en version of the site. 301 all the /en versions of pages to /. Update the hreflangs to point EN-language users to the / version.
- Redirect the / version of the site to /en. The reverse of the above.
- Keep both the /en and the / versions, and update the links on the / version so that visitors follow links that don't take them to the /en site.
It feels like the /en version of the site is redundant and potentially sending confusing signals to search engines (it's currently a bit of a toss-up as to which version of a page ranks). I'm leaning toward removing the /en version and redirecting to the / version; a rough sketch of what that would leave is below. It would be a big step, as currently - due to the internal linking - about 40% of our traffic goes through the /en path. Anything to be aware of? Any recommendations or advice would be much appreciated.
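If I went with that first option, I imagine the homepage annotations would end up looking roughly like this (locale list abbreviated, and treating / as both the English page and the x-default):

<!-- On https://www.domain.com/ -->
<link rel="canonical" href="https://www.domain.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.domain.com/" />
<link rel="alternate" hreflang="en" href="https://www.domain.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.domain.com/en-gb/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.domain.com/fr-fr/" />
<link rel="alternate" hreflang="de-de" href="https://www.domain.com/de-de/" />
<!-- /en/* would 301 to /*, so the /en URLs drop out of the set entirely. -->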
International SEO | MaxSydenham
-
Geolocation issue: Google not displaying the correct URL in the SERPs
Hello, I'm running a multi-country domain with this structure:
- domain.com/ar/
- domain.com/mx/
- domain.com/cl/
- etc.
I also have domain.com/int/ for x-default. domain.com/category/ does a 301 redirect through IP geolocation to the corresponding URL; for example, if your IP is from Mexico, you get redirected to domain.com/mx/category/. hreflang is correct. Webmaster Tools geolocation is correct.
Example of the issue I'm facing right now: when users from Chile do a keyword search in Google Chile, the domain ranks well, but the URL that appears in the SERP is the /mx/ version, or the /int/ version, or any other country version. Other times it is the /cl/ version. The same happens for all users / countries / keywords. I need to understand what I'm doing wrong, because Google is not displaying in the SERPs the correct URL version for the country of the user doing the search. Thank you so much! I will appreciate your ideas.
PS: I think I should try to change the 301 to a 302 redirect, or completely remove those redirects. Any ideas? Suggestions? Thanks!
International SEO | EstebanCervi
-
I have more than 4,000 pages but still have low traffic. How can I rank better?
My website is a magazine about travel and fashion. Even though I have a lot of pages, I am still ranked low. Why? Thanks for any advice!
International SEO | ccjourn
-
Site Ranking in all countries except USA
Hello, I have a site, www.apdermatology.com, that is ranking #1 for "Dermatologist Chelsea MI" and "Dermatologist Chelsea Michigan" in Google in Canada, the UK, Australia, etc. But in the USA it is on the 4th+ page, it has been this way for weeks if not months, and it does not seem to come up. I originally thought that maybe Google was penalizing the site, although it comes up in all other countries. Does anyone have any recommendations on how to resolve this, or what the problem may be? Thanks.
International SEO | element8design
-
International Landing Page Strategy
Hello, I'm looking for some insight in an area that I don't have much experience in - hoping the community can help! We are a healthcare staffing company serving clients in the U.S. (www.bartonassociates.com). We are interested in attracting clients in Australia and New Zealand. I'm wondering if anyone has experience with best practices for doing so (both from an SEO and a PPC perspective). Would it be best to purchase .au and .nz domains for these landing pages and link back to our US site for more information (or even recreate a modified version of our US site for .au and .nz)? My concern here is duplicate content issues, among other things. Or would it be better to create Australia- and New Zealand-focused landing pages on our US site and drive PPC there? My concern here is that we would never get organic traffic from Australia and New Zealand to our US site, in light of the competition. Also, the messaging would be a bit mixed if targeting all three countries. Our core terms are "locums" and "locum tenens". Greatly appreciate any insight from you guys. Thanks, Jason
International SEO | ba_seomoz
-
IP Redirection vs. cloaking: no clear directives from Google
Hi there, here is our situation: we need to force an IP redirection for our US users to www.domain.com, and at the same time we have different country-specific subfolders with their own language, such as www.domain.com/fr. Our fear is that by forcing an IP redirection for US IPs, we will prevent Googlebot (which has a US IP) from crawling our country-specific subfolders. I didn't find any clear directives from Google representatives on the matter. In this video Matt Cutts says it's always better to show Googlebot the same content as your users: http://www.youtube.com/watch?v=GFf1gwr6HJw&noredirect=1, but on the other hand, in another video he says "Google basically crawls from one IP address range worldwide because (they) have one index worldwide. (They) don't build different indices, one for each country". This seems like a contradiction to me... Thank you for your help! Matteo
International SEO | H-FARM
-
Site Spider / Crawler / Scraper Software
Short of coding up your own web crawler, does anyone know of / have any experience with a good bit of software to run through all the pages on a single domain? (And potentially on linked domains 1 hop away...) This could be either server- or desktop-based. Useful capabilities would include:
- Scraping (XPath parameters)
- # of clicks from homepage (site architecture)
- HTTP headers
- Multi-threading
- Use of proxies
- Robots.txt compliance option
- CSV output
- Anything else you can think of...
Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at:
- Nutch: http://nutch.apache.org/
- Heritrix: https://webarchive.jira.com/wiki/display/Heritrix/Heritrix
- Scrapy: http://doc.scrapy.org/en/latest/intro/overview.html
- Mozenda (does scraping but doesn't appear extensible...)
Any experience / preferences with these or others?
International SEO | AlexThomas