Hreflang/Canonical Inquiry for Website with 29 different languages
-
Hello,
So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc).
Each subdomain has exactly the same content for each page, fully translated into its respective language.
I currently do not have any hreflang/canonical tags set up.
I was recently told that this (below) is the correct way to set these tags up
-For each subdomain (es.example.com/blah-blah for this example), I need to place an hreflang tag pointing to the page the subdomain is on (es.example.com/blah-blah), plus one for each of the 28 other subdomains that have that page (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the website. So I would have 29 hreflang tags, plus a canonical tag.
When I brought this to a friend's attention, he said that pointing the canonical tag at the main www. version would cause the subdomains to drop out of the SERPs in their respective countries' search engines, which I obviously wouldn't want.
I've tried to read articles about this, but I end up always hitting a wall and further confusing myself. Can anyone help? Thanks!
-
_For each subdomain (es.example.com/blah-blah for this example), I need to place an hreflang tag pointing to the page the subdomain is on (es.example.com/blah-blah), plus one for each of the 28 other subdomains that have that page (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the website. So I would have 29 hreflang tags, plus a canonical tag._
Everything is correct except the canonical part (though maybe I misunderstood what you wrote).
If the country-targeted pages are in different languages, then you should not point rel="canonical" to the main www. version at all, because the pages are not identical. If you do, you will start seeing the search snippets of the geo-targeted URLs (shown because of the hreflang) using the title tag and meta description of the www. page; for instance, the snippet of the Italian version would show the Italian URL but everything else in English. If you need to use rel="canonical", it should be self-referential (or, in some cases, point to another URL on the same subdomain).
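To make that concrete, here is a sketch of what the head of the Italian page could look like, with a self-referential canonical. The language codes are assumptions based on the subdomains you mentioned (es, vi, it); you would add one hreflang link per remaining subdomain, and the x-default entry is optional:

```html
<!-- Sketch for the <head> of it.example.com/blah-blah (hypothetical URLs) -->
<link rel="canonical" href="https://it.example.com/blah-blah" />

<!-- One hreflang annotation per language version, including the page itself -->
<link rel="alternate" hreflang="en" href="https://www.example.com/blah-blah" />
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah" />
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah" />
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah" />
<!-- ...repeat for each of the other subdomains -->

<!-- Optional fallback for users whose language isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blah-blah" />
```

Note that every language version must carry the full set of annotations, including one pointing to itself, and the annotations must be reciprocal across all subdomains or Google may ignore them.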
-
Hi,
Probably the easiest solution in your case is to use the geo-targeting settings in Google Webmaster Tools (but only if each of your subdomains targets a specific country, not just a specific language).
If you want to use hreflang, there's quite a good post on it on Moz (http://moz.com/blog/hreflang-behaviour-insights), though I must admit I've personally never used it.
rgds,
Dirk
-
If your translations are automated, Google asks that you not index them, but it sounds like you've created fully translated, static pages. Here's Google's info on that:
"Q: Can I use automated translations?
A: Yes, but they must be blocked from indexing with the "noindex" robots meta tag. We consider automated translations to be auto-generated content, so allowing them to be indexed would be a violation of our Webmaster Guidelines."
Maybe this is where someone got confused. Anyway, here's their larger FAQ on it: https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation. Fully human-translated pages are considered canonical within their own languages, so there's no need to point to the www version as canonical.
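For reference, the "noindex" robots meta tag that Google's FAQ mentions is a single line in the page's head; a machine-translated page would carry something like this:

```html
<!-- Block an auto-generated (machine-translated) page from being indexed -->
<meta name="robots" content="noindex" />
```

This only applies to automated translations; fully human-translated pages like yours should remain indexable.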