Correct site internationalization strategy
-
Hi,
I'm working on the internationalization of a large website; the company wants to reach around 100 countries. I read this Google doc: https://support.google.com/webmasters/answer/182192?hl=en in order to design the strategy.
The strategy is the following:
For each market, I'll define a domain or subdomain with the following settings:
- Keep mysitename.com for the biggest market, where it has been working for years, and set its geographic target in Google Search Console.
- Reserve the ccTLD domains for the other markets.
- In markets where I can't reserve the ccTLD domain, I'll use subdomains of the .com site, for example us.mysitename.com, and set the geographic target for that subdomain in Google Search Console.
Each domain will be in the preferred language of its country only (but the user will be able to change the language via cookies).
The content will be similar in all markets that share a language; for example, the texts on the .co.uk and .us sites will be the same, but the product selections will be specific to each market.
Each URL will link to its equivalent in the other countries, both via a direct link and via hreflang. The idea is that any link relevance one of them earns will be passed on to all the other sites.
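For reference, a sketch of what those hreflang annotations might look like in the head of an equivalent page on each site (the URLs and language codes here are hypothetical; every version would carry the same set of tags, including a self-referencing one):

```html
<!-- Hypothetical page on the main .com site -->
<link rel="alternate" hreflang="en" href="https://www.mysitename.com/page-a/" />
<link rel="alternate" hreflang="en-gb" href="https://www.mysitename.co.uk/page-a/" />
<link rel="alternate" hreflang="en-us" href="https://us.mysitename.com/page-a/" />
<link rel="alternate" hreflang="x-default" href="https://www.mysitename.com/page-a/" />
```

Note that hreflang only takes effect if every version links back to all the others (return tags); if one side is missing, the annotations are ignored.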
My questions are:
- Do you think that there are any possible problems with this strategy?
- Is it possible that I'll have problems with duplicate content? (As I said before, every domain will be assigned a specific geographic target.)
- Each site will have around 2,000,000 URLs. Do you think this could cause problems? It's likely that only the primary and other important locations will have URLs with high-quality external links and decent TrustRank.
- Any other considerations or experience with a similar process would be very much appreciated as well.
Sorry for all the questions, but I want to be really sure about this plan, since the company's growth depends on this internationalization process.
Thanks in advance!
-
Thanks so much Gianluca, I'll take all your ideas into account.
-
You wrote this, and I'd like you to explain it better:
Each domain will only be in the preferred language of each country (but the user will be able to change the language via cookies).
Why should people - for instance, Italians - even feel the need to switch the language from Italian to English?
Honestly, I find it useless.
What you should do is what Amazon does: let people visit whatever version they want. For instance (I live in Spain), when I am in the UK and want to buy something on Amazon, I visit amazon.es. Even though Amazon knows I'm in the UK and suggests that I might prefer to shop on the .co.uk site, it lets me stay, browse, and buy from the .es one.
You then say this:
Each URL will link to the same link in other countries via direct link and also via hreflang. The point of this is that all the link relevance that any of them gets, will be transmitted to all other sites.
That is not quite true, at least not literally. The PageRank any of your pages earns via internal and external links will only partly be passed to the corresponding pages on the other country versions. That is because PageRank flows through every link on a page, both internal and external, and "evaporates" through nofollow links.
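A toy numeric sketch of that dilution, assuming the classic PageRank model (this is an illustration only, not Google's actual algorithm; the damping factor and link counts are made up):

```python
# Toy model: the rank a page can pass is damped, then split evenly
# among ALL links on the page; the share assigned to nofollow links
# "evaporates" instead of flowing anywhere.
DAMPING = 0.85  # damping factor from the original PageRank paper

def rank_passed_per_link(page_rank, followed_links, nofollow_links):
    """Rank flowing through each individual followed link on a page."""
    total_links = followed_links + nofollow_links
    return page_rank * DAMPING / total_links

# A page with rank 1.0, 45 followed links (internal navigation plus
# links to the other country versions) and 5 nofollow links:
per_link = rank_passed_per_link(1.0, followed_links=45, nofollow_links=5)
print(round(per_link, 3))  # each country-version link receives only 0.017
```

So each alternate country version receives only a small fraction of the page's rank, not all of it.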
About your questions:
- Do you think that there are any possible problems with this strategy?
Overall it is correct (my only doubt being the "cookie" thing you mentioned).
- Is it possible that I'll have problems with duplicate content? (As I said before, every domain will be assigned a specific geographic target.)
If you use hreflang, you should not have issues related to duplicate content.
- Each site will have around 2,000,000 URLs. Do you think this could cause problems? It's likely that only the primary and other important locations will have URLs with high-quality external links and decent TrustRank.
Having millions of URLs should not be a problem... if it were, sites like Etsy, Home Depot or Amazon would be suffering from it, wouldn't they? When it comes to big sites, the most important things are a very solid architecture and very well-executed internal linking.
- Any other considerations or experience with a similar process would be very much appreciated as well.
When implementing the hreflang annotations, try not to use as many hreflang entries as there are country versions.
In other words, apart from the home page (for obvious localized brand visibility, and to avoid, for instance, the .com version outranking the local one because it is more authoritative), use hreflang annotations on internal pages only to suggest to Google which version to show for countries that share the same language.
For instance, say www.dominio.com/page-a is in English and targets the USA; then its hreflang annotations should only reference the URLs of the corresponding English pages targeting other English-speaking countries, but you should not add annotations for the Spanish- or Italian-language versions.
Why? Because the languages are different, and that is such a strong signal on its own that you don't need to tell Google to present the Spanish country version, instead of the American English one, to Spanish-speaking users in Spain.
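Following that advice, the annotations on the hypothetical internal page www.dominio.com/page-a would cover only the English-language versions, for example:

```html
<!-- Internal page targeting the USA: reference only the other
     English-speaking country versions, and omit the Spanish and
     Italian ones entirely. -->
<link rel="alternate" hreflang="en-us" href="https://www.dominio.com/page-a" />
<link rel="alternate" hreflang="en-gb" href="https://www.dominio.co.uk/page-a" />
<link rel="alternate" hreflang="en-au" href="https://www.dominio.com.au/page-a" />
```

(The .co.uk and .com.au domains here are made up for the example.)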
-
Thanks Dmitrii.
Any other opinions would be appreciated as well; this process is really important for this website.
-
Hi there.
Everything seems good to me. Just make sure you use proper hreflang or canonical tags for content that could potentially be duplicated, that you have a proper/correct sitemap, and that there are no problems with crawlability and accessibility.
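If you declare hreflang in the sitemap rather than in each page's head, a sketch of one entry might look like this (hypothetical URLs; each listed alternate needs its own url entry linking back):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.mysitename.com/page-a/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.mysitename.co.uk/page-a/" />
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://us.mysitename.com/page-a/" />
  </url>
</urlset>
```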
Good luck