International SEO - How do I show correct SERP results in the UK and US?
-
Hi, Moz community.
I hope you're all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically when it comes to ranking pages in the UK and the US simultaneously. We currently have two sites set up, each aimed at its respective country: a '.com' for the UK and a '.com/us' for the US.
If anybody could help with the issues below, I would be very grateful. Thank you all.
Issues
- When searching Google US through a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]
- When checking the Google cache, the UK page version also appears.
- This causes a problem, especially as I am writing title tags and meta descriptions for the US pages that are meant to be distinct from the UK versions.
- However, when clicking through from the SERP listing, the US page loads as it should. I find it bizarre that the US page appears once you click through, yet the UK version is what shows in the search results themselves.
Current Set-Up
- Our UK and US page content is often very similar across the ".com" and ".com/us" sites, and our US pages are canonicalised to their UK versions to avoid potential duplicate-content penalties.
- We have also added hreflang to our UK and US pages.
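For illustration only (this snippet is not from the original post), the set-up described above would put something roughly like this in the head of a US page, with the canonical pointing at the UK version while hreflang points to both:

    <!-- US page: https://www.example.com/us/product/ (hypothetical URLs) -->
    <link rel="canonical" href="https://www.example.com/product/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />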
Query
- How do I get the US version of our pages to show in US Google search results instead of the UK version?
My Theories/Answers
- US page versions have to be completely unique, with content written for US search intent, and be indexed separately - therefore no longer canonicalised to the UK version.
- Ensure hreflang is in place to point Google to the correct local page versions.
- Ensure local backlinks point to the localised pages.
If anyone can help, it will be much appreciated. Many thanks all.
-
Same to you! Happy to help!
-
Thank you for taking the time to help me with all of my questions Kate. It is refreshing to know that experienced SEO marketers like yourself are happy to help others build their knowledge.
I hope you have a good weekend!
-
Yeah, that is actually what hreflang was intended for: differentiating pages that have the same content, just translated - even if only into a different dialect. Alas, it is also used for geo-targeting, but I try not to be mad about it.
Change as much as is needed to make users in the target market comfortable. There is no hard and fast rule.
-
Thanks again Kate. This makes sense to me now and it seems to be a nice, easy method. I just have one final question when it comes to differentiating content between UK and US pages.
If we have a page that is relatively similar in terms of content, but the language has been amended to match the local dialect, will this remove the duplication issue if hreflang is in place?
Say, for example, there are 5 key features about a product on a page, and 3 of them are suited to both the US and UK markets. Is it enough to add localised spellings to each description, or would the entire paragraph have to be re-written from scratch to create 2 unique copies?
I see that some competitors rewrite their content entirely, which makes sense if they're appealing to different local user intent, while others only alter spellings and price points where needed. What are your thoughts on this?
Thanks,
Katarina
-
If the page is https://www.example.com/us/product/ then the hreflang on that page should be:
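The tags from the original reply are not preserved here; based on the surrounding discussion, the recommendation is a reciprocal pair along these lines:

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />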
If it is on https://www.example.com/product/ then the hreflang set is actually the same - both pages carry the same reciprocal pair of tags.
The other two lines are not needed. x-default is for your homepage when there is no target and you are asking users to set their target. If you visit https://www.ikea.com/ in an incognito window, you'll see what I mean.
And a general "en" is not needed here. You are using hreflang to help the search engines understand the difference between content for countries that share the same language. As much as I hate it being used for that purpose, they do use it as a signal. A general "en" would be for a business that didn't geo-target and simply offered translations - one page in English, one in Spanish, etc. - with no localization.
-
Hi Kate!
Thanks for your response, I really appreciate the help. What you say makes a lot of sense. The reason we are opting for separate US and UK versions is that we offer different package and pricing information to each market, so it was important to have a distinction between the two.
One thing that is very new to me, however, is the use of hreflang. Here is a sample of what we currently have on our UK and US pages:
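The code sample from the original post is not preserved here; judging from the replies, it was presumably a four-line set something like this, with the en-us line being the emboldened one:

    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
    <link rel="alternate" hreflang="en" href="https://www.example.com/product/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />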
I wasn't sure whether we needed to include only the emboldened line of code on US pages. Are the other three lines necessary? The same set appears on our UK pages as well.
Thanks in advance!
-
Hi Katarina!
Your theories are right, but let me explain a little more.
- "US page versions have to be completely unique, with content written for US search intent, and be indexed separately - therefore no longer canonicalised to the UK version."
  If you are going to create both a US and a UK version of a page, there needs to be a reason why. If the only reason is "someone told us we should," then do just one page. If there is a real reason, such as differing product information, then the pages need to be distinct from each other.
- "Ensure hreflang is in place to point Google to the correct local page versions."
  This ties into the point above. If you use a canonical and hreflang together, the engines get confused: the canonical tells them the two pages are the same, while the hreflang tells them the pages differ because of localization. You can't have both. Remove the canonical and make sure the hreflang is right (there's a quick sketch at the end of this post).
- "Ensure local backlinks point to the localised pages."
  Yes!
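As a rough sketch of that second point (not from the original answer), after removing the cross-country canonical, the head of the US page could carry a self-referencing canonical plus the reciprocal hreflang pair, something like:

    <!-- US page: https://www.example.com/us/product/ (hypothetical URLs) -->
    <link rel="canonical" href="https://www.example.com/us/product/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />

The UK page would mirror this: a canonical to itself plus the same two hreflang lines.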