Help with international targeting
-
Hi all!
Okay, so we've got a site, let's say example.com - we sell training courses worldwide with a particular focus on just 8 countries.
Historically, we've never targeted users in different countries effectively; we've just got the one example.com that floats about, ranking in different countries. But our content is dynamic (obviously a big SEO no-no): we pick up the IP of the user and show the content relevant for that country without the URLs changing.
This presents an obvious SEO flaw in that we can't effectively target people in our key countries. So, we're introducing the targeting as subfolders (/uk/, /ie/, etc.). My questions are:
1. Would this be the correct implementation of hreflang AND canonical tags for the URL https://www.example.com/es/?
2. The second thing I was wondering about is the 'International Targeting' setting in Search Console.
Because of our current set-up (no regional targeting and a dynamic catch-all), we haven't set a target country for www.example.com. Would we be better off leaving that untargeted and only specifying regional targets for the new subfolder URLs (www.example.com/us/, /uk/, etc.), or should we set the .com to the USA as the default?
We'd be a bit wary of doing this because most of our traffic comes from the UK and South Africa, so I'm assuming it would be best to leave this alone, unless someone else has a different opinion?
I know Googlebot almost always crawls .coms from the US, which is why we were thinking of leaving the .com as the 'catch-all' and specifying the US version.
3. Finally, we do have a lot of pages which don't really change between countries at all (like the About Us page). Would we give these any special directives to avoid duplicate content (as the content on these won't be changing at all), or do we just keep the structure as shown above? I.e. would the About Us page (even though not changing) still be (with the canonical):
URL: https://www.example.com/about-us/ (x-default)?
Thanks in advance!
-
If I understand the OP's intent, it is to target countries, not languages. Hreflang can specify alternates for a language, or a language-country combination, but unfortunately not for a country alone. So, as the OP has proposed, yes, you do need to specify both the language and the country. And that does bring up a dilemma faced by many of us in terms of what language to use. If your content is all in English, then yes, you should use something like "en-FR". BUT, you might also want to include an "fr-FR" as well, pointing to the same alternate URL, because there are going to be a lot more France-based visitors on Google whose browser settings are for French than for English. For sure, both do exist (there are native English speakers in France too), but you don't have to choose one - you can include both. Google may not completely respect your directives since the content is in English (assuming that's the case), but it's what I would recommend. So, for each country (assuming the content is in English), include both an English and a language-specific hreflang tag (pointing to the same destination) for that country.
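For example, on the France page the two tags would sit side by side, both pointing at the same URL - a minimal sketch only, assuming an English-language /fr/ subfolder like the others:
<!-- Both tags reference the same France URL, so searchers in France with
     English or French browser/language settings are both matched -->
<link rel="alternate" hreflang="en-FR" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.example.com/fr/" />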
Since your last example uses "es-ES", I assume you may be planning to also publish some content in Spanish. But if not, again, realize you can include multiple hreflang tags for a single country, all pointing to the same page.
I also don't know where you are based, but if the business is US-based, I wouldn't also duplicate the US as a localization; rather, I would make that the default (the x-default). Or, if you are based somewhere else, same thing, but with that country.
On question 2, you can set up a GSC property for folder paths (www.example.com/fr/) and target those. I would not target the root level (www.example.com) in your case, because that setting would also apply to all the subfolders. That's one of the advantages of using subdomains instead of subfolders: you can target each one independently. With subfolders, you can target everything except the root (because root targeting would cascade downward).
On question 3, you should do the same as in question 1, as long as you are duplicating those pages in each subfolder. Otherwise, if you don't give a directive as to which page to index, then, since the pages are duplicates, Google is going to choose for you - and it might not choose the one you prefer.
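To make that concrete for the About Us page: each country copy would carry its own canonical plus the full set of alternates, with the x-default pointing at the root version as in the OP's example. A sketch only - the real block would list all 8 countries:
<!-- On https://www.example.com/uk/about-us/ ; the same block, with the
     canonical and self-referencing hreflang adjusted, goes on every copy -->
<link rel="canonical" href="https://www.example.com/uk/about-us/" />
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/uk/about-us/" />
<link rel="alternate" hreflang="en-IE" href="https://www.example.com/ie/about-us/" />
<link rel="alternate" hreflang="es-ES" href="https://www.example.com/es/about-us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/about-us/" />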
-
The language codes you are using in the above examples are not correct. The correct language codes should be "en", "fr", "it", "es". If you want to specify the country code, it must appear after the language code - more info here.
Keep in mind that hreflang tags are not used by web browsers to load the preferred language automatically; they are intended for search engines.
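For example, with the lowercase language code first and the optional uppercase country code second (a quick sketch):
<!-- Valid: language alone, or language followed by country -->
<link rel="alternate" hreflang="es" href="https://www.example.com/es/" />
<link rel="alternate" hreflang="es-ES" href="https://www.example.com/es/" />
<!-- Not valid: a country code on its own, e.g. hreflang="ES" -->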
-
Related Questions
-
Setting up international site subdirectories in GSC as separate properties for better geotargeting?
My client has an international website with a subdirectory structure for each country and language version, e.g. /en-US. At present, there is a single property set up for the domain in Google Search Console, but there are currently various geotargeting issues I'm trying to correct with hreflang tags. My question is: is it still recommended practice, and helpful, to add each international subdirectory to Google Search Console as an individual property to help with correct language and region tagging? I know there used to be property sets for this, but I haven't found any up-to-date guidance on whether setting up all the different versions as their own properties might help with targeting. Many thanks in advance!
-
International SEO & Duplicate Content: ccTLD, hreflang, and rel=canonical tags
Hi Everyone, I have a client that has two sites (example.com & example.co.uk), each with the same English content but no hreflang or rel="canonical" tags in place. Would this be interpreted as duplicate content? They haven't changed the copy to speak to specific regions, but have tried targeting the UK with a ccTLD. I've taken a look at some other comparable questions on Moz, like this post: https://moz.com/community/q/international-hreflang-will-this-handle-duplicate-content where one of the answers says "If no translation is happening within a geo-targeted site, HREFLANG is not necessary." If hreflang tags are not necessary, then would I need rel="canonical" to avoid duplicate content? Thanks for taking the time to help a fellow SEO out.
-
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts regarding the structure of international, multi-language sites. The website is in the fashion industry, and I think this is a common problem for the industry. The website is translated into 5 languages and sells in 21 countries. As you can imagine this creates a huge number of URLs - so many that with Screaming Frog I can't even complete the crawl. For example, the UK site is visible in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously, for SEO only the first version is important. One other example: the French site is available in 5 languages, and again...
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 main issues:
1. Endless crawling, with crawlers not focusing on the most important pages
2. Duplication of content
3. The wrong geo URLs ranking in Google
I have already implemented hreflang but haven't noticed any improvements. Therefore my question is: should I exclude the inappropriate targeting with robots.txt and "noindex"? For example, for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve by doing this is to have the crawlers more focused on the important SEO pages, avoid content duplication, and stop the wrong URLs ranking in local Google results. Please comment
-
International SEO question: domain.com vs domain.com/us/, domain.com/uk/, etc.
Hi Mozzers, I am expanding a website internationally. I own the .com for the domain. I need to accommodate multiple countries and I'm not sure if I should build a folder for /us/ for the United States or just have the root domain .com.
OPTION 1:
domain.com/page-url -- United States
domain.com/de/page-url -- Denmark
domain.com/jp/page-url -- Japan
OPTION 2:
domain.com/us/page-url -- United States
domain.com/de/page-url -- Denmark
domain.com/jp/page-url -- Japan
My concern with option 2 is that there will be some dilution and we wouldn't get the full benefit of inbound links compared to option 1, as we would have geo-IP redirection in place to redirect users etc. to the relevant sub-folder. Which option is better from an SEO perspective? Cheers, Jeremy
-
International Sites - Sitemaps, Robots & Geolocating in WMT
Hi Guys, I have a site that has now been launched in the US, having originally been UK-only. In order to accommodate this, the website has been set up using directories for each country. Example:
domain.com/en-gb
domain.com/en-us
As the site was originally set up for the UK, the sitemap, robots file & Webmaster Tools account were added to the main domain. Example:
domain.com/sitemap.xml
domain.com/robots.txt
The question is: does this now need changing to make it specific to each country? Example: the sitemap and robots.txt for the UK would move to:
domain.com/en-gb/sitemap.xml
domain.com/en-gb/robots.txt
and the US would have its own separate sitemap and robots.txt. Example:
domain.com/en-us/sitemap.xml
domain.com/en-us/robots.txt
Also, in order to geolocate this in WMT, would this need to be done for each directory version instead of the main domain? Currently the WMT account for the UK site is verified at www.domain.com; would this need re-verifying at domain.com/en-gb? Any help would be appreciated! Thanks!
-
Cross domain rel alternate, will it help or hurt?
I have a website that has similar pages on a US version and a UK version. We want UK traffic to go to the UK site, but the US domain is so strong it is outranking the UK site in the UK. We want to try using rel alternate but have some concerns. Currently, for some of our keywords, the US is #1 and the UK is #4. If we implement rel alternate, will it just remove our US page? We don't want to shoot ourselves in the foot and lose traffic. Is this worth doing, or will it just remove our US ranking and our double listing? Any anecdotes, experiences or opinions are appreciated. Thanks.
-
Will duplicate content across international domains have a negative effect on our SERPs?
Our corporate website www.tryten.com showcases/sells our products to all of our customers. We have Canadian and UK-based customers and would like to duplicate our website onto .ca and .co.uk domains respectively to better serve them. These sites will showcase the same products; only the price and ship-from locations will change. Also, the phone numbers and contact info will be altered. The sites will all be on one server. On each of the sites there will be a country selector which will take you to the appropriate domain for the country selected. Will doing this negatively affect our rankings in the US, UK and Canada?
-
What is the best SEO site structure for multi-country targeting?
Hi There, We are an online retailer with four (and soon to be five) distinct geographic target markets (we have physical operations in both the UK and New Zealand). We currently target these markets like this:
United Kingdom (www.natureshop.co.uk)
New Zealand (www.natureshop.co.nz)
Australia (www.natureshop.com/au) - using a Google Webmaster Tools geo-targeted folder
United States (www.natureshop.com) - using a Google Webmaster Tools geo-targeted domain
Germany (www.natureshop.de) - in German, and yet to be launched as a full site
We have various issues we want to address. The key one is this: our www.natureshop.co.uk website was adversely affected by the Panda update on April 12. We had some external SEO firms work on this site for us, and unfortunately the links they gained for us were very low quality, from sometimes spammy sites, and also "keyword"-packed with very little anchor text variation. Our other websites (the .co.nz and .com) moved up after the updates, so I can only assume our external SEO consultants were responsible for this. I have since managed to get them to remove around 70% of these links and we have brought all SEO efforts back in house again. I have also worked to improve the quality of our content on this site, and I have 404'ed the six worst-affected pages (the ones that had far too many single-phrase anchor text links coming into them). We have, however, not budged much in our rankings (we have made some small gains but not a lot). Our other weaknesses are slower-than-ideal page load times and some "thin" content. We are on the cusp (around 4 weeks away) of deploying a brand new platform using ASP.NET MVP with N2, and this looks like it will address our page load speed issues. We have also been working hard on our content building and I believe we will address that as well with this release. Sorry for the long build-up, however I felt some background was needed to get to my questions. My questions are:
1. Do you think we are best to proceed with trying to get our www.natureshop.co.uk website out of the Panda trap, or should we consider deploying a new version of the site on www.natureshop.com/uk/ (geo-targeted to the UK)?
2. If we are to do this, should we do the same for New Zealand and Germany and redirect the existing domains to the new geo-targeted folders?
3. If we do this, should we redirect the natureshop.co.uk pages to the new www.natureshop.com/uk/ pages, or will this simply pass on the Panda "penalty"?
4. Will this model build stronger authority on the .com domain that benefits all of the geo-targeted subfolders, or does it not work this way?
5. Finally, can we deploy the same pages and content on the different geo-targeted subfolders (with some subtle regional variations of spelling and language), or will this result in a duplicate content penalty?
Thank you very much in advance to all of you, and I apologise for the length and complexity of the question. Kind Regards,
Conrad Cranfield
Founder: Nature Shop Ltd