International SEO - How do I show correct SERP results in the UK and US?
-
Hi, Moz community.
I hope you’re all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically about ranking pages in the UK and the US simultaneously. We currently have two websites set up, each aimed at its respective country: a ‘.com’ and a ‘.com/us’.
If anybody could help with the issues below, I would be very grateful. Thank you all.
Issues
- When looking at US Google search results through a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]
- When checking the Google cache, the UK page version also appears.
- This causes a problem, especially as I am creating title tags and meta descriptions that are distinct from the UK versions.
- However, when clicking through from the SERP link, the US page loads as it should. I find it bizarre that clicking through shows the US page, yet the UK version appears in the search results themselves.
Current Set-Up
- Our UK and US page content is often very similar across our “.com” and “.com/us” websites, and our US pages are canonicalised to their UK versions to avoid potential duplicate-content penalties.
- We have also added hreflang to our UK and US pages.
Query
- How do I get the US version of our page, rather than the UK version, to appear in US Google search results?
My Theories/Answers
- US page versions have to be completely unique, with content matched to US search intent, and be indexed separately - therefore no longer canonicalised to the UK versions.
- Ensure hreflang is in place to point Google to the correct local page versions.
- Ensure local backlinks point to the localised pages.
If anyone can help, it will be much appreciated. Many thanks all.
-
Same to you! Happy to help!
-
Thank you for taking the time to help me with all of my questions Kate. It is refreshing to know that experienced SEO marketers like yourself are happy to help others build their knowledge.
I hope you have a good weekend!
-
Yeah, that is actually what hreflang was intended for: differentiating pages that have the same content, just translated, even if only into another dialect. Alas, it is also used for geo-targeting, but I try not to be mad about it.
Change as much as needed to make the target market user comfortable. There is no hard and fast rule.
-
Thanks again Kate. This makes sense to me now and it seems to be a nice, easy method. I just have one final question when it comes to differentiating content between UK and US pages.
If we have a page that is relatively similar in terms of content, but the language has been amended to match the local dialect, will this remove the duplication issue if hreflang is in place?
Say, for example, there are 5 key features about a product on a page, and 3 of them are suited to both the US and UK markets. Is it enough to add localised spellings to each description, or would the entire paragraph have to be re-written from scratch to create 2 unique copies?
I see that some competitors rewrite their content entirely, which makes sense if they're appealing to differing local user intent, but others only alter the spellings and price points where needed. What are your thoughts on this?
Thanks
Katarina
-
If the page is https://www.example.com/us/product/ then the hreflang on that page should be:
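Something along these lines - a sketch using example.com as a stand-in for the real domain:

```html
<!-- On https://www.example.com/us/product/ -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
```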
If it is on https://www.example.com/product/ then the hreflang block is actually the same - each version lists itself and its alternates.
The other two lines are not needed. x-default is for your homepage when there is no target and you are asking users to set their target. If you visit https://www.ikea.com/ in an incognito window, you'll see what I mean.
And a general en is not needed here. You are using hreflang to help the SEs understand the difference in content across countries that use the same language. As much as I hate it being used for that purpose, they do use it as a signal. A general "en" would be for a business that doesn't geo-target and just has translations: one page in English, one in Spanish, etc., but no localization.
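If you ever do add a country-chooser homepage, that is the one place an x-default line would belong - something like:

```html
<!-- Fallback for users in any locale you don't explicitly target -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```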
-
Hi Kate!
Thanks for your response, I really appreciate the help. What you say makes a lot of sense. The reason we are opting for US and UK sites is that we offer different package and pricing information to each market so it was important to have a distinction between the two.
One thing that is very new to me, however, is the use of hreflang. Here is a sample of what we currently have on our UK and US pages:
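It is roughly the following four lines (the en-us one is the line we emboldened; example.com stands in for our actual domain):

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```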
I wasn't sure whether we only need to include the emboldened line of code on US pages. Are the other 3 lines necessary? The same layout appears on our UK pages too.
Thanks in advance!
-
Hi Katarina!
Your theories are right but let me explain a little more.
- US page versions have to be completely unique with content related to US search intent and be indexed separately - therefore no longer canonicalised to UK version.
If you are going to create a US and UK version of your page, there needs to be a reason why. If there is no reason other than "someone told us we should," then only do one page. If there is a reason, like differing product information, then the pages need to be distinct from each other.
Ensure hreflang is enabled to point Google to correct local page versions
This is blended with what you said above. If you use a canonical and hreflang together, the engines will get confused: the canonical tells them the two pages are the same, while the hreflang tells them the pages are different localizations. You can't have both. Remove the canonical and make sure the hreflang is right.
Ensure local backlinks point to localised pages.
Yes!
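To make the second point concrete, once the cross-country canonical is removed, the US page's head would carry a self-referencing canonical alongside the hreflang pair - a sketch with example.com standing in for the real domain:

```html
<!-- <head> of https://www.example.com/us/product/ -->
<link rel="canonical" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/" />
```

The UK page would mirror this with its own self-referencing canonical and the same two hreflang lines.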
-