HREFLANG for multiple country/language combinations
-
We have a site set up in English, German, French, Spanish, and Italian, and we offer every language for every European country we serve (over 30). That produces 150+ URL combinations, as we use a /country/language/ subdirectory structure.
Should I list out every combination in hreflang? Or should I simply choose the most applicable combinations (/de/de/, /fr/fr/, etc.)? If we go the latter route, should I block Googlebot from crawling the atypical combinations?
Best,
Sam
-
Hi Sam,
Apologies for the slow response. Your question slipped through the net.
This is an interesting case!
In an ideal world, you'd specify the relationship between all of those pages, in each direction. That's 150+ tags per page, though, which is going to cause some headaches. Even if you shift the tagging to an XML sitemap, that's a _lot_ of weight and processing.
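To put rough numbers on that (an illustrative sketch only, assuming 30 countries and 5 languages, with every page annotating every variant including itself):

```python
# Back-of-the-envelope maths for a full reciprocal hreflang mapping
# (assumed figures: 30 countries x 5 languages).
countries, languages = 30, 5

pages = countries * languages               # 150 country/language URLs
full_annotations = pages * pages            # every page lists every variant (incl. itself)
primary_annotations = languages * languages # only 5 primary pages cross-referencing each other

print(pages)                # 150
print(full_annotations)     # 22500 hreflang entries site-wide
print(primary_annotations)  # 25
```

Twenty-two-and-a-half thousand annotations versus twenty-five is the gap between "tag everything" and "tag only the primaries".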
Anecdotally, I know that hreflang tagging starts to break at that kind of scale (especially on large sites, where the resultant XML sitemaps can reach many gigabytes in size, or where Google is crawling faster than it's processing the hreflang directives), so tagging everything isn't going to be a viable approach.
I'd suggest picking out and implementing hreflang for _only_ the primary combinations*, as you suggest, reducing the site-wide mapping to the primary variant in each case.
- *Note that the valuable/primary combinations might not just be the matching _/xx/xx/_ versions; there may be some mixed country/language combinations that are worth including, too.
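As a sketch of what the reduced mapping might look like (hypothetical URLs and primary choices; I've assumed /gb/en/ as the English primary and x-default, but you'd substitute whichever combinations you treat as primary), each primary page would carry something like:

```html
<!-- In the <head> of https://example.com/de/de/ (illustrative only) -->
<link rel="alternate" hreflang="de-DE" href="https://example.com/de/de/" />
<link rel="alternate" hreflang="en-GB" href="https://example.com/gb/en/" />
<link rel="alternate" hreflang="fr-FR" href="https://example.com/fr/fr/" />
<link rel="alternate" hreflang="es-ES" href="https://example.com/es/es/" />
<link rel="alternate" hreflang="it-IT" href="https://example.com/it/it/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/gb/en/" />
```

Each of the five primaries would carry the same set (hreflang needs to be reciprocal), with its own URL appearing as the self-referencing entry.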
For the atypical variants, I think that you have a few options:
1. Use meta robots (or X-Robots-Tag) directives to set noindex. This will keep those pages out of the index, but doesn't guarantee that you're effectively managing/consolidating value across near-duplicates; you may be quietly harming performance without realising it, as those pages still represent points of crawl and value wastage/leakage.
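For reference, the noindex approach is a one-liner on each atypical variant (the /de/fr/ example here is hypothetical):

```html
<!-- In the <head> of an atypical variant, e.g. /de/fr/ -->
<meta name="robots" content="noindex" />
```

Or, if it's easier to set at the server level, the equivalent HTTP response header is `X-Robots-Tag: noindex`.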
2. Use robots.txt to prevent Google from accessing the atypical variants. That won't necessarily stop them from showing up in search results, though, and isn't without problems: you risk creating crawl dead-ends, writing off the value of any inbound links to those pages, and other issues.
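A robots.txt version would also be unwieldy at this scale, because robots.txt can't express "everything except the matching pairs"; you'd need a rule per atypical folder (paths below are illustrative):

```
# Illustrative only - one Disallow per atypical country/language folder
User-agent: Googlebot
Disallow: /de/en/
Disallow: /de/fr/
Disallow: /de/es/
# ...and so on, for each of the ~145 remaining atypical combinations
```

That's another reason this option tends to be the least attractive of the three.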
3. Use canonical tags on all of the atypical variants, referencing the nearest primary version, to attempt to consolidate value/relevance. However, that risks the wrong language/content showing up in the wrong country, as you're explicitly _un_-optimising the location component.
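If you did go the canonical route, each atypical variant would point at its nearest primary (hypothetical URLs; /at/de/ here standing in for German-for-Austria):

```html
<!-- In the <head> of an atypical variant such as /at/de/ -->
<link rel="canonical" href="https://example.com/de/de/" />
```

Note that this explicitly tells Google the Austrian page is "the same as" the German one, which is exactly the location signal you'd be throwing away.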
I think that #1 is the best approach, as per your thinking. That removes the requirement to do anything clever or manipulative with hreflang tagging, and fits neatly with the idea that the atypical combinations aren't useful/valuable enough to warrant their own identities - Google should be smart enough to fall back to the nearest 'generic' equivalent.
I'd also take care to set up your Google Search Console country targeting for each country-level folder, to reduce the risk of people ending up in the wrong sections.