Hreflang tag implementation
-
Hi,
We've had hreflang tags implemented on our site for a few weeks now, and while we are seeing some improvements for the regional subfolders, I wanted to double-check that I had the tags implemented correctly (a couple of examples are below). The regional subfolder sites are now ranking instead of the US site for some keywords, but some key search terms are still returning the US site. Could this be due to incorrect implementation on those specific pages?
Due to complications with Magento we're implementing the tags in the sitemaps. Magento also appears to be inserting a self-referencing rel="canonical" tag automatically on each page, e.g. on www.example.com/uk/security-cameras (one of the pages we're having issues with) the canonical tag is <link rel="canonical" href="http://www.example.com/uk/security-cameras" />. Is this an issue?
Any advice would be appreciated. Thanks.
<url><loc>http://www.example.com/uk/dvrs-kits</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url>
<url><loc>http://www.example.com/uk/dvrs-kits/1080p</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url>
-
Thanks. It took me a while, but I did realise that one eventually.
-
Thanks Thomas. That is how our sitemap now reads. It just seems odd that when you access the sitemap via a browser the hreflang tags don't appear...
-
The answers you received here in the thread are correct. Your sitemap implementation is wrong because it commits a classic mistake: the self-referencing hreflang annotations are missing.
In other words, you must declare that the URL http://www.example.com/uk/dvrs-kits is the one Google must show to English-speaking users in the UK.
So you must add this:
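(A minimal sketch of that self-referencing annotation; en-gb is assumed as the value for the UK subfolder, and the xhtml:link lines for the other regional versions of the page would sit alongside it in the same <url> entry.)
<url>
<loc>http://www.example.com/uk/dvrs-kits</loc>
<xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/dvrs-kits" />
<!-- one xhtml:link line here for each other language/region version of this page -->
</url>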
Bear in mind that seeing the changes take place in the SERPs will not be immediate.
-
So I will test the sitemap again. After you made the XML sitemap, did you tell Google about language and country targeting? Did you use Google Webmaster Tools to geotarget the subfolders? http://www.google.com/webmasters/
Check your listings from around the world: http://www.isearchfrom.com/
https://support.google.com/webmasters/answer/2620865?hl=en&ref_topic=2370587
You need the sitemap to look like this:
<url>
<loc>http://www.example.com/english/</loc> **This is the URL we want to be indexed**
<xhtml:link rel="alternate" **This tells Google there is an alternate version of the URL**
hreflang="de" **This tells Google the language we're targeting with the alternate version (German)**
href="http://www.example.com/deutsch/" /> **This tells Google the URL of the alternate version**
<xhtml:link rel="alternate" **This tells Google there is an alternate version of the URL**
hreflang="de-ch" **This tells Google the language and country we're targeting with the alternate version (German and Switzerland)**
href="http://www.example.com/schweiz-deutsch/" /> **This tells Google the URL of the alternate version**
<xhtml:link rel="alternate" **This tells Google there is an alternate version of the URL**
hreflang="en" **This tells Google the language we're targeting with the alternate version (English)**
href="http://www.example.com/english/" /> **This tells Google the URL of the alternate version**
</url>
Important note - the example above is only for one URL which has three alternate versions. The following code is an example of a full XML sitemap which includes three URLs, each with three alternate versions.
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
<url>
<loc>http://www.example.com/english/</loc>
<xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
<xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/" />
<xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
</url>
<url>
<loc>http://www.example.com/deutsch/</loc>
<xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
<xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/" />
<xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
</url>
<url>
<loc>http://www.example.com/schweiz-deutsch/</loc>
<xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
<xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/" />
<xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
</url>
</urlset>
I will use DeepCrawl to look for hreflang in the sitemaps. You might want to use the HTTP header method if the sitemap approach is not working.
Thomas
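(For reference, the HTTP header approach mentioned above means sending the annotations in the response headers rather than in the sitemap or the HTML; Google documents it mainly for non-HTML files such as PDFs. A rough sketch, reusing the placeholder URLs from the example above:)
Link: <http://www.example.com/english/>; rel="alternate"; hreflang="en", <http://www.example.com/deutsch/>; rel="alternate"; hreflang="de", <http://www.example.com/schweiz-deutsch/>; rel="alternate"; hreflang="de-ch"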
-
I've updated the sitemaps but the hreflang tags still aren't showing when you view the sitemap...
-
I think I might see the problem. I'm only referencing the alternative pages with the hreflang tag; I've not included the site itself. It should be:
<url><loc>http://www.swann.com/uk/dvrs-kits</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url>
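For illustration, a complete entry with the self-reference and an alternate might look something like this (en-gb and en-us are assumed values, and the US href is a placeholder for whatever the real US version of the page is):
<url>
<loc>http://www.swann.com/uk/dvrs-kits</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority>
<xhtml:link rel="alternate" hreflang="en-gb" href="http://www.swann.com/uk/dvrs-kits" />
<xhtml:link rel="alternate" hreflang="en-us" href="http://www.swann.com/dvrs-kits" />
<!-- the en-us href above is a placeholder; point it at the real US version of this page -->
</url>
-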
This is the beginning of my sitemap for the UK subfolder. As far as I can see we're following both the case studies you posted and Google's advice on the structure, but as you point out, when you view the sitemap online it doesn't show the hreflang tags.
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>http://www.swann.com/uk/dvrs-kits</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url>
<url><loc>http://www.swann.com/uk/dvrs-kits/1080p</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url>
<url><loc>http://www.swann.com/uk/dvrs-kits/960h</loc>
<lastmod>2014-07-23</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority></url></urlset>
-
- Here is the next test
- Use this link to see the crawl of the site
- http://crawl.blueprintmarketing.com/projects/reports/69930?ro=991e4a1c6d5086bd0bd4d0965e3b6037ed69b692
- Files
- http://cl.ly/3G2h3u464400
- http://searchengineland.com/how-to-implement-the-hreflang-element-using-xml-sitemaps-123030
- https://search.nerdydata.com/search/#!/searchTerm=http://www.swann.com//searchPage=1/sort=pop
- sitemap
- http://cl.ly/text/1d2E2h3P022J
- http://cl.ly/text/0t0T3T2p210h
- https://cl.ly/Xjb3
-
Thanks for running this.
The DeepCrawl report says that there are no hreflang tags in place. If you view the sitemap.xml files on the site you can't see any there either. However, they're definitely in the files I'm giving to my colleague to upload.
I matched the structure of the tag to the two case studies you put in your first post...
-
Here is crawl 1
- http://cl.ly/3u0Z3N0S3m1B
- Interactive URL
- http://crawl.blueprintmarketing.com/projects/reports/69929?ro=0de594f6c4b262001d6d234c282d8ade7e42c020
- http://crawl.blueprintmarketing.com/report_grid/trend/69929/pages_without_hreflang
- &
- http://crawl.blueprintmarketing.com/report_grid/69929?repname=pages_without_hreflang
- I am still running a second full crawl; I will post it in the AM.
Tom
-
You could use the sitemap; however, I've had better luck with other methods. If you would like me to, I am more than happy to run DeepCrawl on your site and figure out the problem with your current sitemap setup.
I have DeepCrawl and will use it to check.
I am referencing this from
http://moz.com/blog/hreflang-behaviour-insights
"Section 4: Tools for the serious International SEO
Essentials:
- Reliable rank tracker that can localize: Advanced Web Ranking, Moz, etc...
- Crawler that can validate hreflang annotations in XML sitemaps or within the <head>: The only tool on the market that can do this, and does it very well, is Deepcrawl.
Other nice-to-haves:
- Your own method of "gathering" international search results on scale. You should probably go with proxies.
- Your own method of parsing XML sitemaps and cross checking (even if you use something like Deepcrawl, you'll need to double check).
- Obvious, but worth a reminder: Google webmaster tools, Analytics, access to server logs so you can understand Google's crawl behaviour."
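As a minimal illustration of the "parsing XML sitemaps and cross checking" point above, a small script along these lines would list which hreflang annotations a sitemap actually contains (the sitemap URL is a placeholder; point it at the file you upload):
# Minimal sketch: list the hreflang annotations an XML sitemap actually contains.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/uk/sitemap.xml"  # placeholder URL

NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "xhtml": "http://www.w3.org/1999/xhtml",
}

# Fetch and parse the sitemap.
with urllib.request.urlopen(SITEMAP_URL) as response:
    root = ET.parse(response).getroot()

# Print each URL with its hreflang annotations (or flag URLs that have none).
for url in root.findall("sm:url", NS):
    loc = url.findtext("sm:loc", default="", namespaces=NS)
    links = url.findall("xhtml:link", NS)
    if not links:
        print(loc, "-> no hreflang annotations")
    for link in links:
        print(loc, "-> hreflang =", link.get("hreflang"), "href =", link.get("href"))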
Please look for the report in the AM,
Tom
-
Hi,
I did have a read through those two case studies previously. My understanding was that we could implement the code purely in the sitemap, and that nothing on the site itself was needed. Are you saying we also need some code on the site as well?
The verification tool doesn't say we have it implemented, but I wasn't sure whether it would check the sitemaps or just the site's HTML.
-
Run the site through the tools below.
It appears that you're only using the sitemap and not the on-page code itself; a good look at the code you need is in the URLs below.
- http://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
- http://www.seerinteractive.com/blog/case-study-the-impact-of-hreflang-tag
- Validator: http://flang.dejanseo.com.au/
- Sitemap tools: http://www.themediaflow.com/tool_hreflang.php and http://www.stateofdigital.com/hreflang-sitemap-tool/
- hreflang generator: http://www.internationalseomap.com/hreflang-tags-generator/
- From Google: https://support.google.com/webmasters/answer/189077?hl=en
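For reference, the on-page version of the tags (what the generator tools above produce) goes in the <head> of each page and looks roughly like this, reusing the placeholder URLs from earlier in the thread:
<link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
<link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/" />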
For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value "x-default" as well:
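A sketch of that annotation in the sitemap form used above (the homepage URL is a placeholder):
<xhtml:link rel="alternate" hreflang="x-default" href="http://www.example.com/" />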
Thomas