US domain pages showing up in Google UK SERP
-
Hi,
Our website, which was built predominantly for the UK market, was set up on a .com domain; only two years ago were other domains added: US (.us), IE (.ie), EU (.eu) and AU (.com.au).
In July last year we noticed that a few .us URLs were showing up in the UK SERPs. We realised that the sitemap for the .us site was incorrectly referencing the UK (.com) site, so we corrected that and the .us URLs stopped appearing in the SERPs. I'm not sure whether this actually fixed the issue or whether it was coincidental.
However, in the last couple of weeks more than three .us URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA of the US pages; it is far below that of the UK ones.
Has anyone noticed similar behaviour, and/or could anyone please help me troubleshoot this issue?
Thanks in advance,
R
-
As your agency told you, I too believe that once hreflang is implemented, this kind of issue should stop.
Regarding the sitemap error, it was certainly something that could have confused Google about which site to target.
However, I see that you also have a .eu domain...
I imagine that domain is meant to target the European market, and I suspect it is in English.
If so, remember:
- In countries like Spain, France, Germany and Italy, people don't search the web in English but in Spanish, French, German and Italian. Therefore, that .eu domain is not going to deliver the results you may be hoping for;
- The .eu extension is a generic one and cannot be geotargeted via Google Search Console. This means that, by default, it targets the whole world; hence you will probably see visits from English-speaking users in countries like South Africa, the UK, Ireland, Australia, New Zealand or India, where English is the main language or one of the official ones;
- When it comes to domains like .eu, it is always hard to decide how to implement hreflang. In your specific case, as you are targeting the UK, US, AU and IE with specific domains, the ideal would be to add these hreflang annotations to the .eu site (the example is only for the home page; a sitemap-based alternative is sketched at the end of this answer):
<rel="alternate" href="http://www.domain.eu" hreflang="x-default"><rel="alternate" href="http://www.domain.eu" hreflang="en"><rel="alternate" href="http://www.domain.com" hreflang="en-GB"><rel="alternate" href="http://www.domain.us" hreflang="en-US"><rel="alternate" href="http://www.domain.com.au" hreflang="en-AU"></rel="alternate"></rel="alternate"></rel="alternate"></rel="alternate"></rel="alternate">
With those annotations, you are telling Google to show the .com to users in Great Britain, the .us to users in the United States, the .com.au to users in Australia, and the .eu to all other users searching in English in any other country.
That means your .eu site will also target users in other European countries, both those searching in English (hreflang="en") and those searching in other languages (hreflang="x-default").
Two notes about hreflang="x-default":
- People living in the UK and searching in Spanish will see the .eu domain, because it is the default domain for searches in every language other than English in GB, IE, AU and the US;
- Again, even if you intend the .eu domain to target only European countries, that is impossible, because the .eu extension has no geotargeting power (and regions like Europe or Asia cannot be geotargeted via GSC). So it will be normal to see visits from countries on other continents as well.
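Since the original problem involved a sitemap referencing the wrong domain, it is worth adding that the same hreflang relationships can also be declared in the XML sitemaps rather than in the page head. Below is a minimal sketch for the home page only, using the same placeholder domain names as above; treat it as an illustration of the sitemap-based format, not a drop-in file:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.eu/</loc>
    <!-- the full set of alternates is repeated for every URL listed -->
    <xhtml:link rel="alternate" hreflang="x-default" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.eu/" />
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.domain.com/" />
    <xhtml:link rel="alternate" hreflang="en-US" href="http://www.domain.us/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.domain.com.au/" />
  </url>
</urlset>
Each domain's own sitemap would need an equivalent entry carrying the same set of alternates, because hreflang annotations only count when they are reciprocal.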
-
You're very welcome. Either way, I'd be interested to see how this one progresses.
-
Hi Chris,
Thanks for your quick response and for explaining this so well.
I have looked back and noticed that this occurs roughly every six months: the .us URLs pop up in the UK SERPs for about two weeks and then disappear. We have yet to implement the hreflang tags on the site, and our SEO agency confirms that this should fix the issue.
Will keep this thread updated on the outcome.
Cheers,
RG
-
Whether or not this is an issue kind of depends on what your product or service is. If you provide a local-only service like a restaurant then your US site ranking in the UK would be unusual.
On the other hand, if you sell a physical product this may not be so unusual. For example, here in Australia we're quite limited when it comes to finding men's online clothing stores; most of it comes from the US or the UK, so it's not uncommon to see something like the US Jackthreads show up in the SERPs here.
Since you do have separate domains for each location, this might be an indication that search engines aren't really understanding the different jurisdictions of each site; maybe they're not geo-targeted strongly enough for the algorithm to recognise that each site serves a unique area.
Some of the elements that can help define this, in no particular order (a brief markup sketch follows the list):
- Server location
- HTML language (e.g. lang="en-US")
- Regional language differences (e.g. US spelling vs UK)
- Location markup - on your location pages at the very least
- Location mentions throughout your content
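To make the HTML language and location markup points concrete, here is a minimal sketch of how a US page might declare both its language and a physical US location; the domain, brand name and address are hypothetical placeholders:
<!DOCTYPE html>
<html lang="en-US">
<head>
  <title>Example Brand US</title>
  <!-- location markup: a schema.org Organization with a US postal address -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand US",
    "url": "http://www.domain.us/",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New York",
      "addressRegion": "NY",
      "addressCountry": "US"
    }
  }
  </script>
</head>
<body>
  <!-- regional spelling in the copy (e.g. "color", "customize") reinforces the same signal -->
</body>
</html>
The UK (.com) pages would mirror this with lang="en-GB" and a GB address, giving the algorithm several consistent signals about which audience each site serves.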
While not specifically on-topic, Rand's Whiteboard Friday about scaling geo-targeting offers plenty of great advice that can be applied here as well.