Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
International SEO Subfolders / user journey etc
-
Hi
According to all the resources I can find on Moz and elsewhere regarding international SEO, say in the context of having duplicate US & UK versions of a site, it's best to have subfolders, i.e.
domain.com/en-us/ & domain.com/en-gb/
However, when it comes to the user journey and promoting the web address, it seems a bit weird to say "visit us at domain.com/en-us/"!?
And what happens if someone just enters domain.com from the US or UK?
My client wants to use an IP sniffer, but I've read that's bad practice and that we should use the country/language code subfolders above instead. I'm confused, though, about both the user journey and the experience with multiple subfolders.
Any advice much appreciated!
Cheers
Dan
-
Thanks for your comments, but I'm looking directly into the subfolder option (since a ccTLD isn't an option and subdomains are considered bad practice, from what I can gather after many days of research on Moz etc.).
As a result, this is what I'll issue to a client's development team in this circumstance, where the site's preferred structure is subfolders/directories:
1) Implement IP sniffing on the home page ONLY.
2) Then have subfolders named after the official country abbreviations, which will create a better user experience than using both country and language, i.e. domain.com/us/ as opposed to domain.com/en-us/ or domain.com/en-gb/ etc. This way any IP-based manipulation only affects how the homepage is crawled and doesn't create site-wide indexing issues.
3) Target these folders to the correct countries in Google's and Bing's Webmaster Tools. Use the official country and language codes in the hreflang mark-up as per point 4.
4) Set up sitemaps for each subfolder and rel="alternate" hreflang annotations according to Google's guidelines (see the mark-up sketch after the note below). Here's a great tool to help with correct implementation: http://www.themediaflow.com/resources/tools/href-lang-tool/
5) Specify the content language/country by adding the 'content-language' meta tag in the HTML head.
6) Link between each country/language version in a crawlable and visible manner (for both search engines and users).
7) Create individual profiles in Google Webmaster Tools & Bing Webmaster Tools for each country/language subfolder and geo-target accordingly.
8) Create individual profiles within Google Analytics for each country/language version and configure them to track internal activity between the different versions.
9) Localise content so it has US currency, contact details, spelling, etc.
10) Apply other localisation techniques, such as marking up contact details with schema.org Place code.
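To illustrate point 10, here's a minimal sketch of localised contact mark-up using schema.org; the business name, address and phone number are placeholders rather than details from this thread, and JSON-LD is just one accepted format.
<!-- Placeholder localised contact details for the US version (point 10). -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company (US)",
  "url": "http://domain.com/us/",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001",
    "addressCountry": "US"
  }
}
</script>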
Note RE: hreflang & canonicalisation:
An extra advantage of using hreflang is that it provides a degree of canonicalisation. If canonical tags are employed in the future, never apply them across language versions should the site expand into non-English versions. More info here: http://www.youtube.com/watch?v=Igbrm1z_7Hk
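To make points 4 and 5 concrete, here's a minimal sketch of what the head of the US homepage could contain, assuming the country-only subfolders from point 2; all URLs are placeholders, and the same hreflang annotations can alternatively be declared in each subfolder's XML sitemap.
<!-- hreflang alternates for the US homepage; URLs are placeholders. -->
<link rel="alternate" hreflang="en-us" href="http://domain.com/us/" />
<link rel="alternate" hreflang="en-gb" href="http://domain.com/gb/" />
<link rel="alternate" hreflang="x-default" href="http://domain.com/" />
<!-- Language/country meta tag as per point 5. -->
<meta http-equiv="content-language" content="en-us" />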
-
Hey Dan,
The challenges with international sites are many and varied. The 'best' international strategy really depends on your resources.

Here's how I see the advantages / disadvantages of each approach:
Subfolders - ranking may be 'easier' as domain authority is consolidated, but URLs are ugly
The subfolder approach is often utilised where there are insufficient resources to market and maintain separate international ccTLDs (e.g. .co.uk, .com, .fr etc.). The advantage of the subfolder approach is that you're consolidating domain authority, so the links to /en-gb/ (NB: in hreflang use en-gb rather than en-uk, since GB is the ISO 3166-1 country code for the United Kingdom) pass authority to /en-us/ and vice versa.
You're building one strong site, rather than trying to build two, three (or more) strong sites. However, as you've identified, URLs get long and a bit ugly.
ccTLDs - Arguably nicer for users, but might not rank
Conversely, whilst ccTLDs (.co.uk, .com, .fr etc) are nicer from a user's perspective, you may struggle to rank if you're not able to spend sufficient time and resource on marketing and building links to these domains.
If you have the time and resources, I'd probably go down the ccTLD route, but if you don't, then the subfolder route is probably best.
IP redirects
In terms of the IP sniffers etc., be careful.

Googlebot typically crawls from the US, and as such is likely to be redirected by your sniffer too. Essentially you're in danger of making any non-US versions invisible as far as Google is concerned. For that reason, rather than doing a hard redirect, I prefer Amazon's approach: if you visit Amazon.com from a UK IP you'll see a message which says "Shopping from the UK? Visit Amazon.co.uk.".
That way users get the nudge to direct them to the right site and the bots can still crawl and index all of your content.
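As a rough sketch of that banner approach, assuming the server detects a UK IP and simply chooses to render the notice (no redirect for users or bots); the markup, class names and URL are illustrative only:
<!-- Non-redirecting country notice, rendered only when a UK IP is detected. -->
<div class="country-notice">
  <p>Shopping from the UK? Visit <a href="http://domain.com/en-gb/">the UK version of this site</a>.</p>
  <button type="button" class="dismiss-notice">Stay on the US site</button>
</div>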
-
I want to start by saying I am not a user experience expert! I can tell you that from an SEO perspective, building international sites with subfolders can be advantageous because those international sites will inherit the main domain's authority and you can have one linking strategy that can benefit all areas of the site.
As for the user journey, I can provide some ideas based on what we've done in the past with our clients. The first would be to have an overlay (a modal window) on the main domain.com page that lets users choose their country and then forwards them to the appropriate area of the site.
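For illustration, a minimal sketch of such a country chooser on the .com homepage; the subfolder URLs are placeholders following the /en-us/ and /en-gb/ pattern from the question, and plain links keep every version crawlable:
<!-- Country/language chooser shown on the main domain.com page. -->
<nav class="country-chooser">
  <p>Choose your region:</p>
  <ul>
    <li><a href="http://domain.com/en-us/" hreflang="en-us">United States</a></li>
    <li><a href="http://domain.com/en-gb/" hreflang="en-gb">United Kingdom</a></li>
  </ul>
</nav>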
Another tactic we've used was to purchase domain names unique to each country/language that then redirect to the appropriate area of the site. We would typically only use these domains in offline marketing material (brochures, business cards, etc.), and that way you can tell your prospective customers to visit you at domainuk.com instead of domain.com/en-gb/.
I hope this helps!