Non US site pages indexed in US Google search
-
Hi,
We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japanese Google search results.
We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that allows users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages, failing to detect the region because of our URL structure?
Below are examples of two of our URLs for reference - one from Canada, the other from the US
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
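For context, the region selector in the header is essentially a set of plain links, roughly like this (a simplified sketch; the markup and link labels are illustrative, not our exact code):
<!-- Simplified, illustrative sketch of the header region drop-down -->
<ul class="region-selector">
  <li><a href="/us/en/prod4130078/2500058/catalog20038/">United States</a></li>
  <li><a href="/ca/en/prod4130078/2500058/catalog50008/">Canada</a></li>
  <!-- ...one link per region, following the same /{country}/{language}/ pattern -->
</ul>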
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem?
Thank you.
Angie
-
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
Here are the things you can do to geotarget your content for the search bots:
- Register each subfolder as a separate site in Google Webmaster Tools (e.g. example.com/ca/, example.com/us/), and geotarget it (see here).
- Set meta tags or HTTP headers on each page to let Bing know the language and country (see here).
- For duplicate or near-duplicate pages across different English-speaking localities, you can try out hreflang tags to clue Google in that they're the same page, but geotargeted to users in different locations. I haven't implemented this myself, so I can't speak to how well it works, but you can find more info about it here and here. (A rough sketch of these tags follows this list.)
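To make the last two items concrete, here is a rough, hypothetical sketch of what that markup could look like in the head of the US page, using the URL examples above on a placeholder example.com domain - treat it as illustrative rather than exact:
<!-- Hypothetical <head> snippet for the US English page (placeholder domain) -->
<!-- Bing: declare the page's language and country via a meta tag -->
<meta http-equiv="content-language" content="en-us">
<!-- Google: hreflang alternates pointing at each regional version of the same page -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/en/prod4130078/2500058/catalog20038/">
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/en/prod4130078/2500058/catalog50008/">
If you go the hreflang route, each regional page would carry the same set of alternates so the annotations reference each other reciprocally.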
Setting nofollows just stops PageRank from flowing, but bots can still follow these links, so I wouldn't do that.
-
It's absolutely possible that's what's happening. You can't rely on Google being barred from crawling anything on your site, no matter how well you code it. Even if you blocked the URLs with nofollow, it would not stop the bot.
Another factor is that all your content is in English (as your URL structure suggests it is). Google does a terrible job of separating international content when all of it is in the same language, on the same root domain.
Proper separation, in a way Google can't confuse, is vital. Since I expect you don't intend to change the language across sites, your best option would be to migrate the international content to a completely different domain. At the very least you can then use GWT to tell Google that "this domain is for this country"; better still, you'd host that content on a server in that country.
Related Questions
-
Using same URL for both "en" and "en-us" hreflang tags
Hi, I have a question. Is it okay if I use the same URL for both "en" and "en-us" hreflang tags? For example, for my en-us page: Is this okay with Google? What are your thoughts on this?
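The markup in question would look something like this (example.com is a stand-in for the real domain):
<!-- Illustrative only: both hreflang values pointing at the same en-us URL -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />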
International SEO | Avid_Demand
-
Has anyone seen negative SEO effects from using the Google Translate API?
We have a site currently in development that uses the Google Translate API, and I am having a massive issue getting Screaming Frog to crawl it. All of our non-native English speaking employees have read through the translated copy in their native language, and the general consensus is that it reads at a 5th grade level at best. My questions to the community are: has anyone implemented this API on a site, and has it a) helped with gaining traffic from other languages/countries and b) hurt their site from an SEO standpoint?
International SEO | VERBInteractive
-
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and am not too sure how to resolve them:
1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months.
2. A development subdomain is indexed, despite no longer working (it was taken down a few months back).
3. There's conflicting information on what the best approach is to get HTTPS pages indexed in Baidu, and we can't find a good solution.
4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters. There doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google.
I'm not the one who deals directly with this site, but I believe that Baidu's equivalent of Webmaster Tools has been used where possible to correctly index the site. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
International SEO | jobhuntinghq
-
Hreflang tag on every page?
Hello Moz Community, I'm working with a client who has translated their top 50 landing pages into Spanish. It's a large website and we don't have the resources to properly translate all pages at once, so we started with the top 50. We've already translated the content, title tags, URLs, etc., and the content will live in its own /es-us/ directory.
The client's website is set up so that all content follows a URL structure such as: https://www.example.com/en-us/. Page A will live in English at https://www.example.com/en-us/page-a and in Spanish at https://www.example.com/es-us/page-a ("page-a" may vary since that part of the URL is translated).
From my research in the Moz forums and Webmaster Support Console, I've written hreflang tags for these pages (sketched below). Page B follows the same structure as Page A, and I wrote its corresponding hreflang tags the same way.
My question is, do both of these tags need to be on both the Spanish and English version of the page? Or would I put the "en-us" hreflang tag on the Spanish page and the "es-us" hreflang tag on the English page? I'm thinking that both hreflang tags should be on both the Spanish and English pages, but would love some clarification/confirmation from someone who has implemented this successfully before.
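To be concrete, the kind of markup I mean is roughly this (placeholder URLs, with a hypothetical translated slug for the Spanish page):
<!-- Illustrative sketch only: hreflang pair for Page A, using placeholder URLs -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page-a" />
<link rel="alternate" hreflang="es-us" href="https://www.example.com/es-us/pagina-a" />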
International SEO | DigitalThirdCoast
-
HELP: Incorrect Meta Tag description showing for the wrong search results
Hi guys, I'm stuck here! I have updated the hreflang tags and updated the sitemaps. I have 3 top-level domains, and my zenory.com site is showing the wrong meta description for the home page: as you can see in the attachment, it is showing the New Zealand site's meta description, which is for zenory.co.nz. Anyone know what might be going on here? I have also fetched the home page through WMT, and it's still returning the same results. Any advice would be much appreciated! Thanks
International SEO | edward-may
-
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi, I have two questions.
Question 1: Is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to redirect (301) the main site to the address www.example.com/service-one-in-certain-city? I am asking this because I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons of this? Should I create folders or pages just for the sake of keywords?
Question 2: Most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser to access the site. I understand that it is common practice to use subdomains or folders to separate different language versions. My question is regarding subfolders. Is it better to have only the subfolder shown (www.example.com/en), or should I also include the specific page's URL after the subfolder, with keywords (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some show the page's URL after the subfolder.
Thanks in advance, Sam
International SEO | Awaraman
-
Poor Google.co.uk ranking for a UK based .net, but great Google.com
I run an extremely popular news & community website at http://www.onedirection.net, but we're having a few ranking issues in Google.co.uk. The site gets most of its traffic from the USA, which isn't a bad thing - but for our key term "one direction", we currently don't rank at all on Google.co.uk.
The site is located on a server based in Manchester, UK, and we used to rank very well earlier this year - fluttering about in position 5-7 most of the time. However, around July we started to fall down to page 2 or 3, and at the start of this month we stopped ranking at all for "one direction" on Google.co.uk. On Google.com, however, we're very strong, always on page one. We're definitely indexed on .co.uk, just not for our main search term - which I find a bit frustrating.
All the content on our site is unique, and we write 2-4 stories every day. We have an active forum too, so a lot of our content is user-generated. We've never had any "unnatural link building" messages in Webmaster Tools, and our link profile looks fine to me. Do we just need more .co.uk links, or are we being penalised for something? (I can't imagine what, though.) It certainly seems that way. Another site, www.onedirection.co.uk, which is never updated and has a blatant ad for something completely unrelated on its homepage, ranks above us at the moment - which I find quite frankly appalling, as our site is pretty much regarded as the world's most popular One Direction news and fan site.
We've spent the last few months improving the page-load times of our site, and we've reduced any unnecessary internal linking. Approx 2 months ago we launched a new forum on the site, 301'ing all the old forum links to the new one, so that could have had an impact on rankings - but we'd expect to see an impact on Google.com as well if that were the issue.
We definitely feel that we should be ranking higher on Google.co.uk. Does anyone have any ideas what the problem could be? Cheers, Chris.
International SEO | PixelKicks
-
Do non-English (localized) URLs help local SEO and user experience?
Hi everyone, this question is about URL best practice for multilingual websites. We have www.example.com in English, and we are building an exact replica of the English site in German at www.example.de. On the German site, we are considering translating some portions of the URLs, for example the last folder and file name, as seen below:
example.de/folder1-in-english/folder2-in-english/folder3-in-german/filename-in-german.html
Is this a good idea? Will this help both SEO and user experience, or will the mixed languages in the URL confuse users? Google's guidelines say that this should be OK. Would love to get feedback from the SEOmoz community! Thanks, Supriya.
International SEO | Amjath