Google does not index the UK version of our site and serves the US version instead. Do I need to remove the hreflang for US?
-
Webmaster Tools indicates that only 25% of the pages on our UK domain (with GBP prices) are indexed.
We have another US domain with identical content but USD prices, which is indexed fine. When I search in Google for site:mydomain, most of my pages seem to appear, but the rich snippets show USD prices instead of the GBP prices we publish on the page (the USD price is not published on the page; I tested with a US proxy and the US price is nowhere in the source code).
When I click a result in Google to view the cached version of a UK product page, Google shows me the US product page as the cached version.
I use the following hreflang code:
<link rel="alternate" hreflang="en-US" href="https://www.domain.com/product" />
<link rel="alternate" hreflang="en-GB" href="https://www.domain.co.uk/product" />
The canonical of the UK page correctly refers to the UK page.
Any ideas? Do I need to remove the hreflang for en-US to get the UK domain properly indexed in Google?
-
Hi Christy, not yet.
-
Hi there, have you been able to figure out this riddle yet -- or are you still working on it? We'd love an update!
Christy
-
Thanks. Will post an update once we've figured out this riddle.
-
Worth a try, especially since nothing else seems to be working at this point.
Sorry I couldn't be of more help. Please let us know what the solution is whenever you figure it out.
-
No.
But in theory it should not make a difference.
hreflang can be implemented either in the sitemap or on the page.
-
Do your sitemaps indicate the varying language versions? See: https://support.google.com/webmasters/answer/2620865?hl=en
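For reference, the sitemap-based hreflang markup that Google doc describes looks roughly like this (a sketch using the placeholder domains from the question; each domain's sitemap lists its own URLs with the full set of alternates):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- entry in the .co.uk sitemap; the .com sitemap needs the mirror-image entry for its own URL -->
  <url>
    <loc>https://www.domain.co.uk/product</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="https://www.domain.co.uk/product" />
    <xhtml:link rel="alternate" hreflang="en-US" href="https://www.domain.com/product" />
  </url>
</urlset>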
-
Thanks for checking. Yes, rich snippets are frequently not showing at the moment; however, that is a different issue and it may be temporary.
My primary concern is rather that, according to Search Console, Google does not index our sitemap for this domain, and, related to that, it shows the .com page as the cached version of the .co.uk page.
-
Very odd. When we do a search for "Adorini Firenze - Deluxe" in Google UK with a UK VPN on, it doesn't get any of the price schema markup. Maybe Google is having a hard time deciding what to do with the schema, so for normal results it doesn't pull any of it. Which keywords are ranking for this page? Do you get similar results to us?
-
This has been happening for many months already.
Good idea to test with a VPN. I just gave it a try with a UK proxy and got the same result.
-
Odd, it looks like it's pulling the cached schema; the issue might correct itself with time.
It's a bit of a long shot, but have you tried the lang/country codes completely in lower case? I doubt that will fix it but the tool in this Moz article I found (https://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool) generates them in all lower case.
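Purely as an illustration, the all-lowercase variant of the tags from the question (same placeholder URLs) would be:
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://www.domain.co.uk/product" />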
Also, out of curiosity, are you doing your test searches through a VPN?
-
Sure, click on the cached version of the first result in the following Google search. You can also see the rich snippets there in USD instead of GBP:
-
Well, at least we've checked off one thing that it is not.
Can you provide a link to the SERP where you are seeing the issue?
-
Everything looks fine in Google Search Console:
no hreflang errors, no sitemap errors, and Google has been crawling basically all our pages every day for many months already.
-
Odd. Are you seeing any errors in Google Search Console (it used to be Google Webmaster Tools) under Search Traffic > International Targeting? It will show any hreflang errors; I would start there and fix whatever it reports.
Related Questions
-
Why Doesn't Google Use My Title Tag and Meta Description?
Hi fellow Moz SEOs, need your URGENT help! We set an optimised title & meta description for our client websites. These titles are approved by our clients. A few days ago they checked on Google and noticed the title & meta description were not the same, and they notified me about this issue. The title & meta description look fine when I check the source code. So why does Google use a different title & meta description? For example:
Title approved by client: Top Specialist Divorce & Family Lawyer - Yeo & Associates LLC
Google set our title: Yeo & Associates LLC: Top Specialist Divorce & Family Lawyer
Title approved by client: Filing For Divorce Online in Singapore | DivorceBureau®
Google set our title: DivorceBureau®: Filing For Divorce Online in Singapore
Title approved by client: Halal Buffet & Bento/Packet Meals Event Caterer Singapore | Foodtalks
Google set our title: Foodtalks - Halal Buffet & Bento/Packet Meals Event Caterer Singapore
Title approved by client: Child Care Centre in Singapore | Top Preschool | Carpe Diem
Google set our title: Carpe Diem: Child care Centre in Singapore | Top Preschool
Every day they are requesting that I update Google's title with their approved title, and asking me these questions:
Why did this happen?
Why didn't Google use their recommended title? Is there any way to set our approved titles? Please help me find the solution ASAP. Thanks in advance!
International SEO | | Verz
-
Why is Google not indexing each country/language subfolder in the rankings?
Hi folks, we use Magento 2 for our multi-country shops (it's a multistore). The URL: www.avarcas.com. In the first days Google indexed the proper URL for each country: avarcas.com/uk, avarcas.com/de, ... Some days later, all the countries are just indexing / (the root). I correctly set the subfolders in Webmaster Tools. What's happening? Thanks
International SEO | | administratorwibee
-
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts regarding the structure of international sites and multiple languages. The website is in the fashion industry; I think this is a common problem for the industry. The website is translated into 5 languages and sells in 21 countries. As you can imagine this creates a huge number of URLs, so many that with Screaming Frog I can't even complete the crawl. For example, the UK site is visible in all these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously only the first version matters for SEO. One other example: the French site is available in 5 languages, and again...
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 issues mainly: endless crawling, with crawlers not focusing on the most important pages; duplication of content; and wrong geo URLs ranking in Google. I have already implemented hreflang but didn't notice any improvements. Therefore my question is: should I exclude the non-appropriate targeting with robots.txt and noindex? Perhaps for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version http://www.MyDomain.com/fr/FR/, and so on (see the sketch below). What I would like to achieve by doing this is to have the crawlers more focused on the important SEO pages, avoid content duplication, and stop the wrong URLs ranking in local Google. Please comment.
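Concretely, the kind of robots.txt exclusion I have in mind would look something like this (just a sketch using the placeholder paths above, not a tested implementation):
User-agent: *
# UK country folder: leave only the English version crawlable
Disallow: /it/GB/
Disallow: /fr/GB/
Disallow: /de/GB/
Disallow: /es/GB/
# France country folder: leave only the French version crawlable
Disallow: /en/FR/
Disallow: /it/FR/
Disallow: /de/FR/
Disallow: /es/FR/
# ...and the equivalent rules for the remaining countries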
International SEO | | guidoampollini
-
License Details across multiple regional brand sites
Hi guys! I have a quick question. Our team are currently having a debate about whether we should display our licensing details as text across all our brands in multiple regions (roughly 50 sites). My argument is that if you are required to have a license to operate legally, Google would EXPECT to be able to crawl those details in order to provide its users with reliable results as opposed to rogue operators. The other side of the argument is that it will tie all the sites together, which would be a huge risk (as Google will perceive it as a network), and also that it would be seen as duplicate content. Would really appreciate any feedback on what is best to do in this case. Thanks!!
International SEO | | RedSearch01
-
Geo Targeting SEO Techniques for Google UK
I'm starting a new SEO project whereby I'll be targeting UK search engines only, such as Google.co.uk (I'm from the States), and I'm gathering all the information I can get on this topic. Obviously, I got a .CO.UK TLD and the hosting/IP is UK based, but can anyone shed light on other techniques that have worked for you? Besides the above, here is some advice I picked up so far:
Regional directory listings,
Inbound and outbound links from/to UK-based websites,
Geographic targeting in Google Webmaster Tools,
British slang... What else is there?
Much appreciated
International SEO | | Plorex
-
Removing United Kingdom next to a generic TLD
We have a generic top-level domain (gTLD), www.xyz.com, which was set to target the United Kingdom in Google Webmaster Tools. We have now launched a US version of the site targeting US consumers, www.xyz.com/us, and set its geographic target to United States in GWT. When I search for xyz on www.google.com, the SERPs bring up the .com site with "United Kingdom" beside it. This will most likely confuse our prospects, as they would think we only have a UK operation. How can I tell Google not to include "United Kingdom" next to www.xyz.com? Any thoughts? Since this was happening, I removed the geographic location target for www.xyz.com (set it to null) in GWT. Would that help solve the issue? Look forward to your reply. Many thanks, Jay
International SEO | | jgohil
-
Site Spider/ Crawler/ Scraper Software
Short of coding up your own web crawler, does anyone know of / have any experience with a good bit of software to run through all the pages on a single domain (and potentially on linked domains one hop away)? This could be either server or desktop based. Useful capabilities would include:
Scraping (XPath parameters) of clicks from homepage (site architecture)
HTTP headers
Multi-threading
Use of proxies
Robots.txt compliance option
CSV output
Anything else you can think of... Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at:
Nutch: http://nutch.apache.org/
Heritrix: https://webarchive.jira.com/wiki/display/Heritrix/Heritrix
Scrapy: http://doc.scrapy.org/en/latest/intro/overview.html
Mozenda (does scraping but doesn't appear extensible)
Any experience / preferences with these or others?
International SEO | | AlexThomas
-
Country specific domains pointing to a .com site
Hello, I am new to SEO, so please go easy on me if this happens to be a "silly" question. My company has a .com site. We are expanding into global markets, focusing on specific countries right now. General question: would I be penalized for duplicate content if I purchased country-specific domains and pointed them to the .com site? Thanks, Jim
International SEO | | jimmer