HREFLANG for multiple country/language combinations
-
We have a site set up in English, German, French, Spanish and Italian. We offer all of these languages for every European country we serve (over 30), using a /country/language/ subdirectory structure. That means there are 150+ different URL combinations.
Should I list out every combination in hreflang? Or should I simply choose the most applicable combinations (/de/de/, /fr/fr/, etc.)? If we go the latter route, should I block Googlebot from crawling the atypical combinations?
Best,
Sam
-
Hi Sam,
Apologies for the slow response. Your question slipped through the net.
This is an interesting case!
In an ideal world, you'd specify the relationship between all of those pages, in each direction. That's 150+ tags per page, though, which is going to cause some headaches. Even if you shift the tagging to an XML sitemap, that's a _lot_ of weight and processing.
Anecdotally, I know that hreflang tagging starts to break at those kinds of scales - especially on large sites, where the resultant XML sitemaps can reach many gigabytes in size, or where Google is crawling faster than it's processing the hreflang directives - so tagging everything isn't going to be a viable approach.
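To make the scale concrete, here's a sketch (using a hypothetical example.com domain) of what full cross-referencing means on _every single page_ of the site - remember that hreflang codes are language-COUNTRY, while your URL paths are /country/language/:

```html
<!-- Hypothetical fragment of the full cross-referencing block.
     With 30+ countries x 5 languages, this runs to 150+ lines per page. -->
<link rel="alternate" hreflang="de-DE" href="https://example.com/de/de/" />
<link rel="alternate" hreflang="en-DE" href="https://example.com/de/en/" />
<link rel="alternate" hreflang="fr-DE" href="https://example.com/de/fr/" />
<link rel="alternate" hreflang="de-AT" href="https://example.com/at/de/" />
<!-- ...and so on, for every remaining country/language pair... -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```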
I'd suggest implementing hreflang for _only_ the primary combinations*, as you suggest, and reducing the site-wide mapping to the primary variant in each case.
- Bear in mind that the valuable/primary combinations might not just be the _/xx/xx/_ or _/yy/yy/_ versions; there may be some mismatched country/language combinations which are worth including, too.
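As a rough sketch of that reduced mapping (hypothetical domain and an illustrative subset of countries - your real list would cover all 30+), you could generate one primary-variant tag per country folder like this:

```python
# Sketch: emit one hreflang <link> tag per country, pointing at that
# country's single primary language version, instead of all 150+ pairs.
# The domain and the PRIMARY mapping below are hypothetical examples.

BASE = "https://example.com"

# Primary language per country folder (illustrative subset only).
PRIMARY = {
    "de": "de",  # Germany -> German
    "fr": "fr",  # France -> French
    "at": "de",  # Austria -> German (country code != language code)
    "ie": "en",  # Ireland -> English
}

def primary_hreflang_tags():
    """Return one <link> tag per country, referencing its primary variant."""
    tags = []
    for country, lang in sorted(PRIMARY.items()):
        code = f"{lang}-{country.upper()}"  # hreflang is language-COUNTRY
        tags.append(
            f'<link rel="alternate" hreflang="{code}" '
            f'href="{BASE}/{country}/{lang}/" />'
        )
    return tags

if __name__ == "__main__":
    for tag in primary_hreflang_tags():
        print(tag)
```

The same mapping could feed an XML sitemap instead of on-page tags; either way, the point is that the tag count scales with the number of countries, not countries x languages.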
For the atypical variants, I think that you have a few options:

1. Use meta robots (or x-robots) tags to set noindex attributes. This will keep them out of the index, but doesn't guarantee that you're effectively managing/consolidating value across near duplicates - you may be quietly harming performance without realising it, as those pages represent points of crawl and value wastage/leakage.

2. Use robots.txt to prevent Google from accessing the atypical variants. That won't necessarily stop them from showing up in search results, though, and isn't without problems - you risk creating crawl dead-ends, writing off the value of any inbound links to those pages, and other issues.

3. Use canonical URLs on all of the atypical variations, referencing the nearest primary version, to attempt to consolidate value/relevance. However, that risks the wrong language/content showing up in the wrong country, as you're explicitly _un_optimising the location component.
I think that #1 is the best approach, as per your thinking. That removes the requirement to do anything clever or manipulative with hreflang tagging, and fits neatly with the idea that the atypical combinations aren't useful/valuable enough to warrant their own identities - Google should be smart enough to fall back to the nearest 'generic' equivalent.
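For reference, option #1 is just a one-line addition to each atypical variant (a sketch; the "follow" directive is my suggestion rather than a requirement):

```html
<!-- On each atypical variant (e.g. a hypothetical /de/fr/ page): -->
<meta name="robots" content="noindex, follow" />
<!-- "follow" keeps the page's internal links crawlable, so some value
     can still flow through it even though it stays out of the index. -->
```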
I'd also take care to set up your Google Search Console country targeting for each country-level folder, to reduce the risk of people ending up in the wrong sections.