What is the proper way to set up hreflang tags on my English and Spanish sites?
-
I have a full English website at http://www.example.com and a Spanish version of the website at http://spanish.example.com, but only about half of the English pages have been translated and exist on the Spanish site.
Should I just add a sitemap to both sites with hreflang tags that point to the correct version of the page?
Is this a proper way to set this up? I was going to repeat this same process for all of the applicable URLs that exist on both versions of the website (English and Spanish).
Is it okay to have hreflang="es", or do I need to have a country code attached as well? There are many Spanish-speaking countries and I don't know if I need to list them all out. For example hreflang="es-bo" (Bolivia), hreflang="es-cl" (Chile), hreflang="es-co" (Colombia), etc...
Sitemap example for the English website URL:
<url>
  <loc>http://www.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
  <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
</url>
Sitemap example for the Spanish website URL:
<url>
  <loc>http://spanish.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
</url>
Thanks in advance for your feedback and help!
-
Sorry for seeing this only just now... and forgive me if I am wrong due to a misunderstanding of the question, but I think Tom's answer is not correct.
You are saying that your main site is in English, but that it also has a Spanish subdomain with only about half of the pages localized into Spanish.
If this is the correct interpretation of your doubts, then on the Spanish subdomain the hreflang should be implemented like so:
IN CASE OF A SPANISH SUBDOMAIN URL WITH SPANISH CONTENT
<url>
  <loc>http://spanish.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
</url>
IN CASE OF A SPANISH SUBDOMAIN URL WITH ENGLISH CONTENT
<url>
  <loc>http://spanish.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="http://spanish.example.com/" />
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
</url>
Why? Because "en" and "es" mean "English language" and "Spanish language", so you cannot declare as Spanish something that is not Spanish. Likewise, you cannot declare both URLs as meant to be shown to English-speaking users, because that would create a hiccup for Google, which would not know which of the two it should finally show to English-speaking users.
Moreover, if you don't want to extend the use of hreflang by also suggesting the countries where a given URL should be shown, then you should canonicalize the spanish.example.com URLs with English content to the original www.example.com URLs.
The idea of also using the ISO country code could somehow solve this issue, by writing something like this:
<url>
  <loc>http://spanish.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="en-es" href="http://spanish.example.com/" />
  <xhtml:link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
</url>
Then you would be telling Google to show the spanish.example.com URL to people searching in English in Spain (Google.es), and the English one to all people searching in English in the rest of the world.
Be aware, though, that Spanish users searching in Spanish will see the www.example.com URL in their Google.es SERPs, because the x-default tells Google that everyone not matching the language indicated in the hreflang="en-es" annotation (which is English) should see the main domain URL, not the Spanish subdomain one.
Hreflang is quite a sudoku, but it is extremely logical.
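To make the "sudoku" concrete, the selection logic described above can be modeled in a short script. This is a deliberately simplified sketch, not Google's actual algorithm; the `pick_url` helper and the sample annotations are made up for illustration. It follows the same priority the answer describes: exact language-region match first, then a language-only match, then the x-default fallback.

```python
# Simplified model of hreflang resolution: exact language-region match wins,
# then a language-only match, then the x-default fallback.
# Illustration only -- not Google's real selection algorithm.

def pick_url(annotations, user_locale):
    """annotations: dict mapping hreflang values to URLs.
    user_locale: e.g. "en-es", "es-es", "en-gb"."""
    user_locale = user_locale.lower()
    if user_locale in annotations:              # exact match, e.g. "en-es"
        return annotations[user_locale]
    language = user_locale.split("-")[0]
    if language in annotations:                 # language-only match, e.g. "en"
        return annotations[language]
    return annotations.get("x-default")         # everyone else

# The country-code example above: English content on the Spanish subdomain.
annotations = {
    "en-es": "http://spanish.example.com/",
    "x-default": "http://www.example.com/",
}

print(pick_url(annotations, "en-es"))  # English searcher in Spain -> subdomain
print(pick_url(annotations, "es-es"))  # Spanish searcher in Spain -> x-default
print(pick_url(annotations, "en-gb"))  # English searcher elsewhere -> x-default
```

Running this shows exactly the behavior described: only English searchers in Spain get the subdomain URL, while Spanish searchers in Spain, like everyone else, fall through to the x-default.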
-
Thanks Tom for your input and feedback.
-
Hi,
To answer your first question: using hreflang tags in your sitemaps is a perfectly fine implementation of the tags. They will work whether they're coded into the <head> of each page, set in the sitemap, or set in HTTP headers. This page will be useful for you, as it explains all three methods quite well: http://www.branded3.com/blogs/implementing-hreflang-tag/
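As a sketch of the HTTP-header method just mentioned (it's the one to reach for with non-HTML resources such as PDFs, which have no <head>), the annotations go into a Link response header using Google's documented `<url>; rel="alternate"; hreflang="x"` pattern. The helper function and the brochure URLs below are made up for illustration:

```python
# Build a Link HTTP header carrying hreflang annotations -- the method
# used for non-HTML resources (e.g. PDFs) that have no <head> element.

def hreflang_link_header(alternates):
    """alternates: dict of hreflang value -> absolute URL."""
    parts = [
        '<{url}>; rel="alternate"; hreflang="{lang}"'.format(url=url, lang=lang)
        for lang, url in alternates.items()
    ]
    return ",".join(parts)

header = hreflang_link_header({
    "en": "http://www.example.com/brochure.pdf",
    "es": "http://spanish.example.com/brochure.pdf",
})
print("Link: " + header)
```

The server would send the resulting string as a `Link:` header on both the English and the Spanish PDF responses.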
But when you add them to your sitemap you should include all variations of the page, along with a default, so that if a French or German searcher accesses your site you can define whether they'll be served the Spanish or the English page, like this:
<url>
  <loc>http://www.example.com/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
  <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  <xhtml:link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
</url>
To answer your second question about countries: you are fine to use hreflang="es" to target all Spanish traffic, but country codes can be useful in some circumstances. For instance, if you have a site talking about football, you could use hreflang="en-us" for a page which refers to the game as 'soccer' and hreflang="en-gb" for the page calling it 'football'.
This Google Webmaster support post explains both quite well under 'Supported language values', which I recommend you take a look at as well: https://support.google.com/webmasters/answer/189077?hl=en
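One gotcha worth checking whichever method you pick: hreflang annotations must be reciprocal. If page A lists B as an alternate, B's entry must link back to A, or Google reports "no return tags" errors. Below is a minimal stdlib-only sketch that checks reciprocity in a sitemap; the `missing_return_tags` helper and the sample sitemap (which deliberately omits a return tag) are made up for illustration.

```python
# Check that hreflang annotations in a sitemap are reciprocal:
# if page A lists B as an alternate, B's entry must list A back.
import xml.etree.ElementTree as ET

NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "xhtml": "http://www.w3.org/1999/xhtml",
}

def missing_return_tags(sitemap_xml):
    root = ET.fromstring(sitemap_xml)
    # Map each page URL to the set of alternate URLs it declares.
    alternates = {}
    for url in root.findall("sm:url", NS):
        loc = url.find("sm:loc", NS).text
        alternates[loc] = {
            link.get("href") for link in url.findall("xhtml:link", NS)
        }
    errors = []
    for loc, alts in alternates.items():
        for alt in alts:
            if alt != loc and alt in alternates and loc not in alternates[alt]:
                errors.append((alt, loc))  # alt never links back to loc
    return errors

# Sample sitemap where the Spanish entry forgets its English return tag.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  </url>
  <url>
    <loc>http://spanish.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="http://spanish.example.com/" />
  </url>
</urlset>"""

print(missing_return_tags(sitemap))
```

For the sample above it reports that http://spanish.example.com/ never links back to http://www.example.com/, which is exactly the kind of omission that triggers the "no return tags" warning.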
Hope that helps,
Tom