Hreflang: mixing with/without country code for the same language
-
Hello,
I would like to serve three different English versions of my website: one for the UK, one for Canada, and one for all other English-speaking users.
It would look like this for a page:
<link rel="alternate" href="https://xxx.com/en-gb/" hreflang="en-GB" /> (English content with £ prices)
<link rel="alternate" href="https://xxx.com/en-ca/" hreflang="en-CA" /> (English content with CA$ prices)
<link rel="alternate" href="https://xxx.com/en/" hreflang="en" /> (English content without currency)
I wonder whether I can mix the hreflang without a country code with the two country-coded hreflangs for the other specific versions, or whether the version without a country code will be served in every country, even the ones I targeted explicitly.
In other words, does hreflang="en" take precedence over hreflang="en-CA" and hreflang="en-GB" when all three are tagged on the same page?
Thank you
-
I think you are taking that rather too literally.
For example, as I said, the .com could be the one targeted with hreflang="x-default". A person in the UK would, by definition, be served the .com/uk version.
You wouldn't put hreflang="x-default" on the /uk homepage.
Regards
Nigel
-
The x-default is just what the link you provided says it is:
From Google: The reserved value hreflang="x-default" is used when no other language/region matches the user's browser setting. This value is optional, but recommended, as a way for you to control the page when no languages match. A good use is to target your site's homepage where there is a clickable map that enables the user to select their country.
If you use it for just one language, the issue comes when you add more languages. x-default is for when no matching language is detected, not for when a general, non-regional language matches.
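To illustrate the distinction, an x-default setup typically looks something like this (the URLs here are made up for illustration and are not from the question):

```html
<!-- Language/region-specific versions -->
<link rel="alternate" href="https://example.com/en-gb/" hreflang="en-GB" />
<link rel="alternate" href="https://example.com/fr-ca/" hreflang="fr-CA" />
<!-- Catch-all served when the user's language/region matches none of the
     above, e.g. a homepage with a country/language selector -->
<link rel="alternate" href="https://example.com/" hreflang="x-default" />
```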
-
Surely the x-default is, as the tag suggests, a default for when no country or language is targeted? So if someone resided in an untargeted country and the site happened to rank, it would be that version that came up.
Someone in the UK (which has a UK target tag) would not go to the default first, as you suggest, and then select their own country and language. That would be misleading.
I agree that the subfolders would be used to target each country but you would still need both country and language. With Canada you may wish to target en and fr as both are relevant and each would reside in a different sub-folder.
The language is essential imho.
Regards Nigel
-
Actually, the x-default is meant to be for a page that allows users to select a country/language combination.
Alexis, in theory, what you are proposing should work. However, it is not always perfect. There is a lot that goes into how Google serves content to each user, so you might not see it working perfectly every time, but you can use the non-country tag together with the two country-specific hreflang tags.
In fact, the country-coded hreflang tags were meant to be dialect-specific. So a site could have US English content and UK English content, but also more general English content for the rest of the English-speaking world.
Since it sounds like the only thing changing is the currency, you might try geo-targeting subfolders. You can do hreflang in addition to that, but geotargeting is what is meant to be used here.
- Content for CA: https://www.domain.com/ca/content
- Content for GB: https://www.domain.com/gb/content
- General Content: https://www.domain.com/content
Claim the subfolders in Google Search Console as different properties and then target each one to those countries in the International Targeting area.
Then add hreflang the way you mentioned with those URLs. However, this setup won't work if you are doing things with another language mixed in. If you are planning on that, let me know.
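Putting the two together, each of the three pages would carry the same set of hreflang annotations, along these lines (a sketch based on the example URLs above, not a definitive implementation):

```html
<!-- Identical block on all three versions of the page -->
<link rel="alternate" href="https://www.domain.com/gb/content" hreflang="en-GB" />
<link rel="alternate" href="https://www.domain.com/ca/content" hreflang="en-CA" />
<link rel="alternate" href="https://www.domain.com/content" hreflang="en" />
```

Note that every version lists all alternates, including itself; hreflang annotations must be reciprocal or Google may ignore them.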
-
Hi Alexis
If the third one is the default then you need a default hreflang tag.
https://moz.com/learn/seo/hreflang-tag
So the last one would have the default tag pointing to it, along the lines of <link rel="alternate" href="https://xxx.com/en/" hreflang="x-default" /> (using the example URL from your question).
More on Google here:
https://support.google.com/webmasters/answer/189077?hl=en
It will then become the default version for anyone not in the UK or Canada. Google will not see any of them as duplicate content.
Regards
Nigel