Rel=alternate to help localize sites
-
I am wondering about the effectiveness of the rel=alternate tag and how well it works at localizing content for specific regions.
Example:
I have a website on a few ccTLDs, but for some reason my .com shows up on Google.co.uk ahead of the .co.uk version of my page. Some people have mentioned using rel=alternate, but in my research it only seems to apply to duplicate content in another language. If I am wrong here, can somebody please help me better understand this application of the rel=alternate tag? All my research leads me to rel=alternate hreflang=, and I am not sure that is what I want.
Thanks,
Chris Birkholm -
Not really so. In fact, you also need to use rel="canonical".
To avoid getting you confused, I really suggest you follow the implementation steps presented by Tim Grice in this post published on SEOWizz:
-
Gianluca,
Really appreciate the feedback here. The one thing I still have a question about is how this rel=alternate tag would look on my .com, as this is where I'm apparently getting a little confused. I would basically list my other English versions there, correct? And the same on my .co.uk, etc.?
-
Hi Thomas,
actually, rel="alternate" hreflang can also be used to define the region, not just the language.
That means that in your specific case you could use en-US (the English version for US users), en-GB (the English version for British users), en-AU (the English version for Australian users), and so on.
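As a sketch with hypothetical domains, the annotations would look like this, and the same block should appear in the head of every version so the references are reciprocal:

```html
<!-- Hypothetical domains; repeat this block on the .com, .co.uk, and .com.au pages -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
```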
Then, to keep the .com version from showing up in place of your regional sites, I suggest you use rel="alternate". Moreover, if the .com site is meant specifically for the US market, it would also be better to tell Google that the US is its target market in Google Webmaster Tools; otherwise you are asking Google to rank it in every regional Google, the .co.uk one included.
Other things that help search engines understand which site to present first in cases like yours are the use of the local currency, addresses, and phone numbers, though I don't know whether that applies to you. Also, your UK site may need a stronger link profile, especially one rich in links from locally authoritative sites.
Another "add-on" is sometimes the use of rel="canonical", but yours doesn't seem to be a case where it applies.
Finally, here are a few links that could be helpful for you:
http://www.rimmkaufman.com/blog/advanced-international-seo-rel-alternate-hreflang-x/13122011/ << A post by Adam Audette; I suggest you read the comments as well;
http://searchenginewatch.com/article/2137882/Newest-International-SEO-Challenge-Hreflang-Canonical-Tags << An interesting overview from Search Engine Watch
Related Questions
-
Massive local + national disconnect in rankings (local deindexed)
I originally asked this question on Webmaster Central and tried RickRoll's solutions, but they don't seem to have solved the issue. The problem: I've been noticing for some time that certain pages of our site (https://www.renthop.com/boston-ma/apartments-for-rent) have been deindexed locally (or ranked very low) while being indexed and well ranked nationally. In fact, it seems the actual page isn't ranking at all (but the blog, https://www.renthop.com/blog, is). This huge mismatch between national and local rankings only seems to happen for Boston and Chicago; other parts of the country seem unaffected (the national and local rankings there are very similar).
A bit of background, and my personal theory as to what's happening: we used to have subdomains, boston.renthop.com and chicago.renthop.com. These subdomains stopped working when we moved the site to the directory format (https://www.renthop.com/boston-ma/apartments-for-rent), and the subdomain URLs were inactive/broken for roughly 4 months. After the 4 months, we added a 301 from each subdomain to the main page (because the subdomains had inbound external links). However, this seems to have caused the directory pages to exhibit the national/local mismatch instead of helping. Is there anything I'm doing wrong? I'm not sure if the mismatch is natural, if the pages are being algorithmically penalized at a local level (I'm negative-SEOing myself), or if they're stuck in some weird state because of the bad subdomain move. Some things I've tried:
• Created verified Webmaster console accounts for both subdomains
• Asked Google to crawl those links
• Done a 1:1 mapping between each individual page on the old site and the new directory format
• Tried 301, 302, and meta-refresh redirects from the subdomains to the directory pages
• Made sure the robots.txt on the subdomains is working properly
• Made sure the robots.txt on the directory pages is working properly
See the screenshot below for the mismatch and deindexing in local search results (this is using SERPS, but it can be replicated with any location changer). Note the difference between the ranking (and the page) when the search is done nationally vs. in the actual location (Boston, MA). I'd really appreciate any help; I've been tearing my hair out trying to figure this out (as well as experimenting).
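For reference, a 1:1 subdomain-to-directory redirect of the kind described above might be sketched in Apache mod_rewrite like this (hypothetical paths; the real mapping depends on the old subdomain's URL structure):

```apache
# Hypothetical sketch: send each old subdomain URL to its equivalent
# directory-format page with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^boston\.renthop\.com$ [NC]
# Landing page of the old subdomain -> new Boston directory page
RewriteRule ^$ https://www.renthop.com/boston-ma/apartments-for-rent [R=301,L]
# Other old paths -> their mapped equivalents (shown here generically)
RewriteRule ^(.*)$ https://www.renthop.com/boston-ma/$1 [R=301,L]
```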
Intermediate & Advanced SEO | lzhou -
Bad site migration - what to do!
Hi Mozzers - I'm just looking at a site which has been damaged by a very poor site migration. Basically, the old URLs were all 301'd to a single page on the new website (not a 404) telling everyone the page no longer existed; they did not 301 old pages to their equivalent new pages. I just checked Google WMT and saw 1,000 crawl errors - basically the old URLs. This migration was done back in February, since when traffic to the website has never recovered. Should I fix this now? Is it worth implementing the correct 301s now, after such a time lapse?
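As a sketch of what the corrective mapping could look like (hypothetical paths, Apache syntax), each old URL gets its own rule pointing at the closest equivalent page on the new site, rather than a generic "page gone" page:

```apache
# Hypothetical examples: one 301 per old URL, each pointing at the
# closest equivalent page on the new site.
Redirect 301 /old-services.html /services/
Redirect 301 /old-about.html /about-us/
Redirect 301 /old-contact.php /contact/
```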
Intermediate & Advanced SEO | McTaggart -
Is it OK to add rel=canonical to the desktop version on top of the rel="alternate" tag (mobile vs. desktop versions)?
Hi mozzers, We launched a mobile site a couple of months ago following the parallel mobile structure, with the URL m.example.com. A week later my Moz crawl detected thousands of dups, which I resolved by implementing canonical tags on the mobile version and rel=alternate on the desktop version. The problem is that I still have dups generated by the CMS: ?device=mobile and ?device=desktop. One option to resolve those is to add canonicals on the desktop versions as well, on top of the rel=alternate tag we just implemented. So my question: is it dangerous to add both rel=canonical and rel=alternate tags on the desktop version of the site? Will it disrupt the rel=canonical on mobile? Thanks
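For reference, the commonly documented pattern for separate mobile URLs pairs the two tags like this (hypothetical URLs):

```html
<!-- On the desktop page, e.g. http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page, e.g. http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page" />
```

The rel="canonical" on the mobile page points back to the desktop URL, so it does not conflict with the desktop page's rel="alternate"; the two annotations are designed to work as a pair.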
Intermediate & Advanced SEO | Ideas-Money-Art -
What next to help with my rankings
I'm after a fresh set of eyes and any suggestions on what I should do next to help increase my site's rankings. The site is: http://bit.ly/VR6xIm. Currently the site ranks around 9th-11th on google.co.uk for its main term, which is the name of the site. The site is around a year old; when it launched it initially went up to positions 3-5, but it has since settled at around where it is now. I have a free tool webmasters can use to embed our speed test into their sites, which also includes a link back to our site to recognise that we are providing the tool for free; I periodically change the link anchor text so it is not always the same anchor text on every site. Is there anything obvious I should be doing, or that is missing, that would help with my rankings? *Just as a note, I am not after a review of the actual speed test on the site; a new one will be developed to further increase accuracy.
Intermediate & Advanced SEO | Wardy -
How can I rank a national site for local terms
Hi All, I have a website that covers all parts of the UK and I wish to be found for terms such as "car for sale London", "car for sale Manchester", and so on. In the past I have created separate landing pages for each town and city, but with the quality score of a page becoming more of a ranking factor, it is hard to make 300+ town pages interesting and useful. Is it best practice to keep doing what I am doing and improve the quality of each of the pages, or would I be better off removing the old pages and using some other technique to rank for the local searches? Thanks for your help
Intermediate & Advanced SEO | MotoringSEO -
Please help me with your advice
Hi all, A couple of years ago I started to build my business on an EMD (exact-match domain). The intention was to create a resource with rich, unique content. After a year of hard work the site reached the top 10 in Google and started to generate a good amount of leads. Then Google announced the EMD update and the site lost 90% of its traffic (after the Panda updates our SERPs had been steady): "a new filter that tries to ensure that low-quality sites don't rise high in Google's search results simply because they have search terms in their domain names." But I don't consider my site a low-quality site; every page, every post is 100% unique and was created only to share knowledge with others. The site has EXCELLENT content from an industry point of view. Since the EMD update I have read hundreds of different articles and opinions about it, and at this point I am confused and lost. What should I do?
• Kill the site and start a new one
• Get more links (but what type of links, and how should I get them?)
• Keep hoping and pray
• Or do something else
Please help me with your advice
Intermediate & Advanced SEO | Webdeal -
Magento site with many pages
I just finished a scan of a Magento site. Of course I am getting thousands of dynamic pages: search pages and others. Checking with the site: command on Google, I see 154,000 results. Which pages is it recommended to block? Some people talk about blocking the search pages, and some actually talk about allowing them. Any answers on this? Thanks
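As one sketch of the "block the search pages" approach (Magento 1-style paths shown; adjust to the store's actual URL structure):

```
# Hypothetical robots.txt blocking Magento internal search and
# sort/filter parameter pages from crawling
User-agent: *
Disallow: /catalogsearch/
Disallow: /*?dir=
Disallow: /*?order=
Disallow: /*?limit=
```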
Intermediate & Advanced SEO | ciznerguy -
Should I be using rel canonical here?
I am reorganizing the data on my informational site into a drilldown menu. Here's an example: on the home page are several different items. Let's say you clicked on "Back Problems". You would then get a menu that says: Disc problems, Pain relief, Paralysis issues, See all back articles. Each of those pages will have a list of articles that fit. Some articles will appear on more than one page. Should I be worried about these pages being partial duplicates of each other? Should I use rel=canonical to make the root page for each section the one that is indexed? I'm thinking no, because I think it would be good to have all of these pages indexed. But then, that's why I'm asking!
Intermediate & Advanced SEO | MarieHaynes