Adding hreflang tags - better on each page, or the site map?
-
Hello,
I am wondering whether there is a preferred method for adding hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to each page, but I am also going to work with the developer to add hreflang tags. My question is: is it better to add them to the header of each page, to the sitemap, to both, or somewhere else? Any other thoughts are appreciated. Our Australian site, which was at least findable on Google Australia before this relaunch, is not showing up, even when you search for the company name directly.
Thanks! Lauryn
-
Yes, your own second guess is the correct one.
Hreflang is URL-based, not domain-based, so you have to specify it for every single URL that needs it.
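To illustrate (these are made-up example URLs, not your client's actual pages), each version of a page carries its own full set of annotations in its head, including a self-referencing tag:

```html
<!-- On https://www.example.com.au/widgets/ (the Australian page) -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />

<!-- On https://www.example.com/widgets/ (the US page) -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```

The href values change with every URL, which is why one generic block copied across the whole site would not cover the deeper pages.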
-
Thank you so much.
Does it suffice if we put the same code in the header across the whole site, or does each unique URL need its own specialized URLs in the code?
Ex: Is a single block of hreflang tags good for the entire site, vs. a tailored block for one URL, AND another for the next URL, AND another ... up to 100+ pages?
-
First of all, remember that the hreflang annotation is not necessarily needed on every page.
That said, which method to use really depends on what your devs can support, whether in-code or in the sitemaps. Both work fine; what you should not do is use both at the same time, because that increases the possibility of creating contradictory hreflang annotations.
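If your devs go the sitemap route, the same information lives in the XML sitemap via xhtml:link entries rather than in the page head. A rough sketch, again with made-up URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com.au/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
  </url>
  <url>
    <loc>https://www.example.com/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
  </url>
</urlset>
```

Whichever you pick, keep hreflang in only that one place so the two sources can never contradict each other.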
-
It depends on the setup of your site, to be honest.
If you have a Wordpress, Joomla, etc. site with an easily accessible header section, where you can put the code in once and it's done forevermore no matter what pages are added, that's the simplest way.
If your dev can script it so the tags are added for each page through the sitemap, that's also a one-and-done way.
The only thing you really don't want to do is have to add a hreflang tag to every new page you add to the site. As long as you can avoid that, you should be right. We had a client add it to their sitemap but the sitemap wasn't auto-generating the tag so each time they updated they had to re-implement the tags. That was a frustrating time ... now we've got it automatically updating so it's much easier to maintain.
-
I think all the implementations work just about the same. We chose to do it in our sitemaps because that was the easiest for our developer to implement. You should choose one or the other; there's no need to do multiple implementations.
Related Questions
-
Home Page Disappears From Google - But Rest of Site Still Ranked
As the title suggests, we are running into a serious issue of the home page disappearing from Google search results whilst the rest of the site still remains. We search for it naturally and cannot find a trace, then use a "site:" command in Google and still the home page does not come up. We go into Webmaster Tools and inspect the home page, and even Google states that the page is indexable. We then run "Request Indexing" and the site comes back on Google. This is having a damaging effect and we would like to understand why this issue is happening. Please note this is not happening on just one of our sites but has happened to three which are all located on the same server. One of our brands which has the issue is: www.henweekends.co.uk
Intermediate & Advanced SEO | JH_OffLimits
-
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don’t want to risk losing ranking for existing USA pages due to issues like duplicate content etc. What is the best way to approach this? This is my first foray into this and I’ve been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German language content: https://company.com/de/products/product-name/
URL for rest of the world: https://company.com/en/products/product-name/
Intermediate & Advanced SEO | Caro-O
-
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000 odd companies on their own b2b website, by producing a new dedicated page per target company on their website - each page including unique copy and a sales proposition (20,000 odd new pages to optimize! Yikes!). I've never come across such an approach before... what might be the SEO pitfalls (other than that's a helluva number of pages to optimize!). Any thoughts would be very welcome.
Intermediate & Advanced SEO | McTaggart
-
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT profiles for both the secure and non-secure pages. Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure this is going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages; however, we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
-
Google's form for "Small sites that should rank better" | Any experiences or results?
Back in August of 2013 Google created a form that allowed people to submit small websites that "should be ranking better in Google". There is more info about it in this article http://www.seroundtable.com/google-small-site-survey-17295.html Has anybody used it? Any experiences or results you can share? *private message if you do not want to share publicly...
Intermediate & Advanced SEO | GregB123
-
How would you handle 12,000 "tag" pages on a Wordpress site?
We have a Wordpress site where /tag/ pages were not set to "noindex" and they are driving 25% of site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. We can't just do nothing or a Panda update will come by and ding us for duplicate content one day (surprised it hasn't already). What would you do?
Intermediate & Advanced SEO | M_D_Golden_Peak
-
Separate site or should we incorporate it into our main site?
Hello, We have a website to sell personal development trainings. The owners want to start 2 blogs - one for each owner - that promote their personal coaching practices. What are the SEO advantages of embedding both blogs in the current site vs. starting 2 brand new blogs with their names as the domain names?
Intermediate & Advanced SEO | BobGW
-
Migrating a site from a standalone site to a subdivision of large .gov.uk site
The scenario: We've been asked by a client, a non-government organisation being absorbed by a larger government ministry, for help with the SEO of their site. They will be going from a reasonably large standalone site to a small sub-directory on a high-authority government site, and they want some input on how best to maintain their rankings. They will be going from the number 1 ranked site in their niche (current site domainRank 59) to being a sub-directory on a domainRank 100 site. The current site will remain, but as a members-only resource behind a paywall. I've been checking to see the impact this had on a related site, but that one has put a catch-all 302 redirect on its pages, so it is losing the benefit of its historical authority. My thoughts: robust 301 redirect set-up to pass as much benefit as possible to the new pages; focus on rewriting content to promote the most effective keywords (would suggest testing of titles, meta descriptions etc., but not sure how often they will be able to edit the new site); 'We have moved' messaging going out to webmasters of existing linking sites to try to encourage as much revision of linking as possible; development of link-bait to try and get the new pages seen. Am I going about this the right way? Thanks in advance. Phil
Intermediate & Advanced SEO | smrs-digital