HREFLANG for multiple country/language combinations
-
We have a site set up with English, German, French, Spanish and Italian. We offer these languages for every European country (over 30). Thus, there are 150+ different URL combinations, as we use the /country/language/ subdirectory path.
Should I list out every combination in hreflang? Or should I simply choose the most applicable combinations (/de/de/ and /fr/fr/, etc.)? If we go the latter route, should I block Googlebot from crawling the atypical combinations?
Best,
Sam
-
Hi Sam,
Apologies for the slow response. Your question slipped through the net.
This is an interesting case!
In an ideal world, you'd specify the relationship between all of those pages, in each direction. That's 150+ tags per page, though, which is going to cause some headaches. Even if you shift the tagging to an XML sitemap, that's a _lot_ of weight and processing.
Anecdotally, I know that hreflang tagging starts to break down at these kinds of scales, especially on large sites where the resulting XML sitemaps can reach many gigabytes in size, or where Google crawls pages faster than it processes the hreflang directives. Tagging everything, then, isn't going to be a viable approach.
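To put rough numbers on the scale problem, here's a back-of-the-envelope sketch, assuming 30 countries and 5 languages with one hreflang entry per variant plus an x-default (your real counts may differ):

```python
# Back-of-the-envelope figures for full reciprocal hreflang tagging.
# Assumes 30 countries x 5 languages; adjust for your actual site.
countries = 30
languages = 5

variants = countries * languages              # 150 URL variants of each page
tags_per_page = variants + 1                  # one entry per variant, plus x-default
total_annotations = variants * tags_per_page  # annotations per page template

print(variants, tags_per_page, total_annotations)  # 150 151 22650
```

That's over 22,000 annotations for every distinct page template on the site, which is where sitemap size and processing lag start to bite.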
I'd suggest picking out and implementing hreflang for _only_ the primary combinations*, as you suggest, and reducing the site-wide mapping to the primary variant in each case.
*You might find that the valuable/primary combinations aren't just the _/xx/xx/_ or _/yy/yy/_ versions, and that some mixed country/language combinations are worth including.
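As a sketch of what the reduced tagging might look like on each primary page (example.com and the exact locale codes here are placeholders, not your real URLs):

```html
<!-- Hypothetical hreflang set, limited to the primary country/language pairs -->
<link rel="alternate" hreflang="de-DE" href="https://example.com/de/de/" />
<link rel="alternate" hreflang="fr-FR" href="https://example.com/fr/fr/" />
<link rel="alternate" hreflang="es-ES" href="https://example.com/es/es/" />
<link rel="alternate" hreflang="it-IT" href="https://example.com/it/it/" />
<link rel="alternate" hreflang="en-GB" href="https://example.com/gb/en/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Every page in the set carries the same block, including a self-referencing entry, which keeps the tagging reciprocal without ballooning to 150+ tags per page.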
For the atypical variants, I think that you have a few options:
1. Use meta robots (or x-robots) tags to set noindex attributes. This will keep them out of the index, but it doesn't guarantee that you're effectively managing/consolidating value across near-duplicates; you may be quietly harming performance without realising it, as those pages represent points of crawl and value wastage/leakage.
2. Use robots.txt to prevent Google from accessing the atypical variants. That won't necessarily stop them from showing up in search results, though, and isn't without problems: you risk creating crawl dead-ends, writing off the value of any inbound links to those pages, and other issues.
3. Use canonical URLs on all of the atypical variations, referencing the nearest primary version, to attempt to consolidate value/relevance. However, that risks the wrong language/content showing up in the wrong country, as you're explicitly _un_-optimising the location component.
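For the first option, each atypical page (say, a hypothetical /at/fr/ Austria-in-French version) would carry something like:

```html
<!-- Keeps the page out of the index while still letting Google follow its links -->
<meta name="robots" content="noindex, follow" />
```

The equivalent X-Robots-Tag HTTP header works too, if you'd rather not touch the page templates.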
I think that #1 is the best approach, as per your thinking. That removes the requirement to do anything clever or manipulative with hreflang tagging, and fits neatly with the idea that the atypical combinations aren't useful/valuable enough to warrant their own identities - Google should be smart enough to fall back to the nearest 'generic' equivalent.
I'd also take care to set up your Google Search Console country targeting for each country-level folder, to reduce the risk of people ending up in the wrong sections.