Hreflang/Canonical Inquiry for a Website with 29 Different Languages
-
Hello,
So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc.).
Each subdomain has exactly the same content for each page, fully translated into its respective language.
I currently do not have any hreflang/canonical tags set up.
I was recently told that this (below) is the correct way to set these tags up:
- For each subdomain (es.example.com/blah-blah for this example), I need to place hreflang tags pointing to the page itself (es.example.com/blah-blah) and to the same page on each of the other 28 subdomains (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the website. So each page would have 29 hreflang tags plus a canonical tag.
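To make that concrete, here is roughly what that suggested markup would look like in the head of the Spanish page (the URLs are just the placeholders from above, and I've only written out a few of the 29 hreflang lines):

```html
<!-- Head of es.example.com/blah-blah, as it was suggested to me -->
<link rel="canonical" href="https://www.example.com/blah-blah" />
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah" />
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah" />
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah" />
<!-- ...one hreflang line per remaining subdomain, 29 in total... -->
```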
When I brought this to a friend's attention, he said that pointing the canonical tag at the main www. version would cause the subdomains to drop out of the SERPs in their respective countries' search engines, which I obviously wouldn't want.
I've tried reading articles about this, but I always end up hitting a wall and confusing myself further. Can anyone help? Thanks!
-
_For each subdomain (es.example.com/blah-blah for this example), I need to place hreflang tags pointing to the page itself (es.example.com/blah-blah) and to the same page on each of the other 28 subdomains (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the website. So each page would have 29 hreflang tags plus a canonical tag._
Everything is correct except the canonical part (though maybe I misunderstood what you wrote).
If the country-targeted pages are in different languages, then you should NOT point the rel="canonical" to the main www. version, because the pages are not identical. If you do, the search snippets for the geo-targeted URLs (which still appear thanks to the hreflang annotations) will start using the title tag and meta description of the www. version page. The Italian version, for instance, would show up with the Italian URL but everything else in English. If you use rel="canonical" at all, it should be self-referential (or, in some cases, point to another URL on the same subdomain).
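As a minimal sketch, the corrected head of the Spanish page would look something like this. It assumes the same placeholder URLs as above; the x-default line is optional and shown with one possible choice of fallback:

```html
<!-- Head of es.example.com/blah-blah: canonical is self-referential -->
<link rel="canonical" href="https://es.example.com/blah-blah" />
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah" />
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah" />
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah" />
<!-- ...one hreflang line per remaining subdomain... -->
<!-- Optional: a fallback for users whose language isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blah-blah" />
```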
-
Hi,
Probably the easiest solution in your case is to use the geo-targeting settings in Google Webmaster Tools (but only if each of your subdomains targets a specific country, not a specific language).
If you want to use hreflang, there is quite a good post on it on Moz (http://moz.com/blog/hreflang-behaviour-insights), though I must admit I've personally never used it.
rgds,
Dirk
-
If your translations are automated, Google requests that you don't index them, but it sounds like you've created fully translated, static pages. Here's Google's info on that:
"Q: Can I use automated translations?
A: Yes, but they must be blocked from indexing with the “noindex” robots meta tag. We consider automated translations to be auto-generated content, so allowing them to be indexed would be a violation of our Webmaster Guidelines."
Maybe this is where the confusion came from. Anyway, here's their larger FAQ on internationalisation: https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation. Fully translated pages are considered canonical within their own languages, so there's no need to point to the www. version as canonical.
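For reference, the tag Google is describing there is the standard robots meta tag; on any machine-translated page it would sit in the head like this (a generic example, not specific to your site):

```html
<!-- Ask search engines not to index an auto-translated page -->
<meta name="robots" content="noindex" />
```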