Hreflang usage for language & country vs. language only
-
Hi guys,
I'm dealing with a client's website where the hreflang tags are implemented as follows:
As you can see, the hreflang tags reference the language & country code as well as the language code alone, with the same URL for both (for French: hreflang="fr-fr" as well as hreflang="fr", both pointing to https://www.website/fr/ihr-besuch/online-tickets).
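A sketch of the setup described, keeping the placeholder domain from the post (the real hostname isn't shown):

```html
<!-- Both tags point to the same URL: one with language+country, one with language only -->
<link rel="alternate" hreflang="fr-fr" href="https://www.website/fr/ihr-besuch/online-tickets" />
<link rel="alternate" hreflang="fr" href="https://www.website/fr/ihr-besuch/online-tickets" />
```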
Is this a problem, and should it be corrected so that either the language & country code or only the language code is referenced?
Thanks in advance!
-
Hi,
I don't know if it would make a huge difference in how Google understands the site, but it is not a 'normal' setup per the documentation: https://support.google.com/webmasters/answer/189077?hl=en
If the site does not have different regional variation pages for a language (UK vs. USA English, for example), then there is no need to double up the hreflang markup with regional notation. You just need hreflang="en" and it is good for all English-language searches regardless of location. The same goes for all the other languages.
Personally, I would remove the extra regional tags to avoid confusion and keep things as simple as possible.
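In that simplified form, each language version gets a single language-only tag. A sketch, reusing the placeholder domain from the question (the English path and the x-default choice are hypothetical):

```html
<!-- One tag per language version; no regional doubling -->
<link rel="alternate" hreflang="fr" href="https://www.website/fr/ihr-besuch/online-tickets" />
<link rel="alternate" hreflang="en" href="https://www.website/en/online-tickets" />
<!-- Optional: fallback for users whose language isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://www.website/en/online-tickets" />
```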
Related Questions
-
AMP version of website
Hello & thanks for reading. It's maybe the Monday morning blues, but I have two versions of a website - www.gardeners.scot and www.gardeners.scot/AMP/ - and the pages on the AMP version have canonicals pointing to the "normal" website. Should the links on "www.example.com/AMP/" point to the AMP website or the normal website? What are your thoughts?
Technical SEO | livingphilosophy
-
Buying multiple domains: misspellings, .net, .org, etc. & 301s
Hi, an SEO guy told me to buy up domains like ours (X.org, .net, .biz, etc.) & misspellings. This could cost over $100/year. Is it worth it for SEO, or is it just covering our @ss in case competitors want to get stupid and buy those? I don't foresee competitors doing that. What do you suggest? Does Google actually give us points for those, AND if we bought them, are we supposed to redirect all of them to our site? Should I be doing this for our SEO clients? Thanks.
Technical SEO | JCunningham
-
WordPress & use of 'www' vs. non-'www' for Webmaster Tools - explanation needed
I am having a hard time understanding the issue of canonicalization of site pages, specifically with regard to the 'www' and 'non-www' versions of a site, and specifically with regard to WordPress. I can see that it doesn't matter whether you type 'www' or not in the URL for a WordPress site - what is going on in the back end that allows this? When I link up to Google Webmaster Tools, should I use www or not? Thanks for any help. d
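For what it's worth, WordPress normally redirects requests to whichever hostname is configured as the Site Address and, on posts and pages, emits a canonical link, so www and non-www resolve to one version. A sketch of what the rendered page head would contain, with a hypothetical domain:

```html
<!-- example.com is hypothetical; WordPress serves the host set in Settings → General
     and points both www and non-www visitors at this one canonical URL -->
<link rel="canonical" href="https://www.example.com/sample-page/" />
```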
Technical SEO | dnaynay
-
Ajax #! URLs, Linking & Meta Refresh
Hi, We recently underwent a platform change and unfortunately our updated ecom site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc. - it needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing AJAX #! on our site, and I've been tasked with creating a document of items I will test to see if this solution will help our rankings, indexing, etc. (on Google - I've read about the issues with Bing). I have 2 questions:
1. Do I need to notify our content team, who works on our linking strategy, about the new URLs? Would we use the #! URL (for SEO), or would we continue to use the clean URL (without the #!) for inbound links?
2. When our site transferred over, we used meta refresh on all of the pages instead of 301s for some reason. Instead of going to a clean URL, our meta refresh says this: . Would I update it to have the #! in the URL? Should I try to clean up the meta refresh so it goes to an actual www. URL and not this browsererrorview page? Or just push for the 301?
I have read a ton of articles, including the GWT docs, but I can't seem to find any solid information on these specific questions, so any help I can get would be greatly appreciated. Thanks!
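For reference, a generic meta refresh of the kind being described looks like this (the destination URL is hypothetical; the actual tag didn't survive in the post), though a server-side 301 is generally the cleaner fix since it passes the redirect signal directly:

```html
<!-- 0-second client-side redirect; hypothetical destination URL -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/" />
```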
Technical SEO | Improvements
-
Schema.org: reviewCount & ratingValue - best practices for implementation?
I'd like to add our merchant review count and rating to our site and use the schema.org markup to indicate to the search engines what these are. The reason I'd like to do this is so that the star rating pulls through to the organic listing. Check out this example from several UK tight sites - notice how the organic listings display the star rating... My questions are:
1. Has anyone seen an example of this on Google.com (the US site)?
2. I heard that you should only add this markup to the homepage, but I couldn't find any Google documentation to back this up. Do you know if this can be applied throughout the site w/o penalty?
Thanks everyone!
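A sketch of schema.org microdata for a merchant rating of the kind the question describes; the merchant name, ratingValue, and reviewCount shown here are hypothetical values:

```html
<!-- schema.org AggregateRating in microdata; all values are hypothetical -->
<div itemscope itemtype="https://schema.org/Organization">
  <span itemprop="name">Example Merchant</span>
  <div itemprop="aggregateRating" itemscope itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.6</span>/5
    based on <span itemprop="reviewCount">213</span> reviews
  </div>
</div>
```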
Technical SEO | evoNick
-
Best geotargeting strategy: Subdomains or subfolders or country specific domain
How have the relatively recent changes in how Google perceives subdomains changed the best route to onsite geotargeting, i.e. not building out new country-specific sites on country-specific (and locally hosted) domains, and instead developing subdomains or subfolders and geotargeting those via Webmaster Tools? In other words, given the recent change in Google's perception, are subdomains now a better option than subfolders, or is there not much in it? Also, if a client has a .co.uk and they want to geotarget, say, France, is the subdomain/subfolder route still an option, or is the .co.uk still too UK-specific, so these options would only work using a .com? In other words, can sites on country-specific domains (.co.uk, .fr, .de, etc.) use subfolders or subdomains to geotarget other countries, or do they have no option other than to develop new country-specific (domain/hosting/language) websites? Any thoughts regarding current best practice in this regard are much appreciated. I have seen last Feb's WBF which covers geotargeting in depth, but the way Google perceives subdomains has changed since then. Many thanks, Dan
Technical SEO | Dan-Lawrence
-
Duplicate Content via a product feed & data
We have uniquely created all of our product content on our website (titles, product descriptions, images, etc.). However, we are also the manufacturer of these products and supply a number of trade customers. These customers often wish to set up their own websites to re-sell the products, and in the past we have quite happily handed over this content to help them sell on their sites. Generally we give them a 'data dump' of our web data and images, but from reading about duplicate content, this will lead to the search engines seeing lots of identical content on these customer sites. Whilst we wish to support our customers, we do not want to harm our (and their) sites by spreading lots of duplicate content around the web. Is there a way we can help them with the data without penalizing ourselves? The other issue is that we also take this data feed and use it to sell on both Amazon & Google Base. Will using this identical data also count as duplicate content, as a quick search does show both our website and the Amazon product page? When creating Amazon listings, do these need to vary from the standard website descriptions? Thanks
Technical SEO | bwfc77
-
Should I use noindex, follow & the rel=canonical tag on one page?
I am having a pagination problem with one of my clients' sites, so I am deciding to use the noindex, follow tag on pages 2, 3, 4, etc. so as not to have a duplicate content issue, because obviously SEOMoz Crawl Diagnostics is showing me a lot of duplicate page content. For the past 2 days I have been in a constant battle over whether to use the noindex, follow tag or the rel=canonical tag for pages 2, 3, 4, and after going through all the Q&A, none of them gives me a crystal-clear answer. So I thought, "Why can't I use the two of them together on one page?" Because I think (correct me if I am wrong):
1. noindex, follow is the old, traditional way to battle duplicate content
2. rel=canonical is the new way to battle duplicate content
My reason to use the two of them together is: the bot finds the non-canonical page first and looks at the noindex, follow tag, so it knows not to index that page; meanwhile it finds out that the canonical URL is such-and-such according to the URL given in the tag, no? Help please???
Technical SEO | DigitalJungle
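For what it's worth, the two tags being weighed here would sit in the head of a paginated page like this (URLs hypothetical). Note that combining them sends mixed signals: noindex asks Google to drop the page, while canonical asks it to consolidate that page's signals into another URL, so the usual advice is to pick one:

```html
<!-- Head of page 2 of a paginated series; URLs are hypothetical -->
<meta name="robots" content="noindex, follow" />
<link rel="canonical" href="https://www.example.com/category/" />
```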