Hreflang Tags vs. HTML Lang Attribute
-
I have a site that is based in the US, but each page has several versions for different regions. These versions live in folders (/en-us for US English, /en-gb for UK English, /fr-fr for French, etc.). Obviously, the French pages are in French. However, two versions of the site are in English with little variation in content. The pages all have a lang attribute to indicate the language each page is in, but there are no hreflang tags to indicate that the pages are the same page in different languages.
My question is: do I need to go through and add hreflang tags to each page so they reference each other, identifying to Google that these are not duplicate content but different language versions of the same content? Or will Google figure that out from the lang attribute?
-
Without hreflang markup, the en-US and en-GB pages will be treated as duplicate content, and you do not want that. In fact, even with hreflang, the two may be considered duplicates if there isn't enough differentiated content.
Also, be careful with canonicals. You shouldn't specify the en-US page as the canonical URL for the fr page. The fr page is its own page and you should use hreflang to specify other language versions.
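For reference, hreflang annotations can live in the head of every version of a page. Here's a minimal sketch, assuming a hypothetical example.com domain and the folder structure described above; every version lists every version, including itself:

```html
<!-- Sketch only: example.com is a hypothetical domain standing in for the real site.
     Place this same set of annotations on the en-us, en-gb, and fr-fr versions. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/page/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/page/" />
<!-- Optional fallback for visitors who match none of the versions above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en-us/page/" />
```

If editing every page's head is impractical, the same annotations can alternatively be delivered through the XML sitemap or HTTP headers.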
-
Thanks, Martijn. The pages all have self-referencing canonical tags (except for the blog posts, where all non-US English pages reference the US English version as the canonical page).
I'm going to be safe and implement the hreflang tags. Do you think the self-referencing canonical tags on each version of the page are going to cause a problem?
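For illustration, self-referencing canonicals and hreflang are designed to coexist: each page declares itself canonical and points to its alternates. A sketch of the en-GB page's head, again assuming a hypothetical example.com domain:

```html
<!-- On the hypothetical https://www.example.com/en-gb/page/ -->
<link rel="canonical" href="https://www.example.com/en-gb/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/page/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/page/" />
```

The blog setup described above is the part to watch: a non-US post that canonicalizes to the US version is telling Google not to index it as its own language version, which undercuts any hreflang annotations on that post.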
-
Hi Mike,
I definitely wouldn't rely only on the HTML lang attribute, as that's something few sites actually use, and it's only a vague indicator to Google of the language actually being used on the page. I would declare the different pages with hreflang tags, and worst case go with a canonical tag implementation.
Martijn.
Related Questions
-
Disallowing URL Parameters vs. Canonicalizing
Hi all, I have a client that has a unique search setup. They have Region pages (/state/city) that we want indexed, and these use self-referential canonicals. They also have a search function that emulates the look of the Region pages. When you search for, say, Los Angeles, the URL changes to /search/los+angeles and looks exactly like /ca/los-angeles. These search URLs can also carry parameters (/search/los+angeles?age=over-2&time[]=part-time), which we obviously don't want indexed. Right now my concern is how best to ensure the /search pages don't get indexed so we don't get hit with duplicate content penalties. The options are:
1. Self-referential canonicals for the Region pages, and disallow everything after the second slash in /search/ so the main search page stays indexed (see the robots.txt sketch below).
2. Self-referential canonicals for the Region pages, and a rule that automatically canonicalizes all other search pages to /search.
Potential concern: /search/ URLs are created even with misspellings. Thanks!
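A minimal robots.txt sketch of option 1, using the URL patterns from the question. Google supports the $ end-of-URL anchor, though note that robots.txt prevents crawling rather than guaranteeing removal from the index:

```
User-agent: *
# Block search results pages and their parameterized variants
Disallow: /search/
# Keep the bare search landing page itself crawlable
Allow: /search/$
```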
Technical SEO | Alces
-
Personalized Content Vs. Cloaking
Hi Moz Community, I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there will be no SEO impact, since personalization won't happen for anyone until there is some interaction? Thanks,
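That pattern, identical initial HTML for everyone with changes only after a client-side interaction, can be sketched roughly as follows (TypeScript in the browser; the element IDs and user name are hypothetical):

```typescript
// Everyone, crawlers included, receives the same initial HTML.
// Personalization is triggered by a user interaction, never by
// user-agent sniffing (which is what risks looking like cloaking).
function personalizeGreeting(name: string): void {
  const greeting = document.querySelector<HTMLElement>("#greeting");
  if (greeting) {
    greeting.textContent = `Welcome back, ${name}!`;
  }
}

// Hypothetical interaction hook: only after the visitor logs in
// does the page diverge from the crawlable initial state.
document.querySelector("#login-form")?.addEventListener("submit", (event) => {
  event.preventDefault();
  personalizeGreeting("Alice"); // placeholder user name
});
```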
Technical SEO | znotes
-
Single page website vs Google
Hi, I was wondering about this issue: there is a website for a guesthouse that has all its information on one page (a valid page with legitimate content). How does Google treat such pages? Would it treat the site as a doorway page, or apply some other penalty? What about bounce rate? It will be pretty high, as there is no option to go anywhere else. What is your opinion on single-page websites, SEO-wise? Is it a shot in the foot? Thanks!
Technical SEO | LeszekNowakowski
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of indexed pages growing up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th. GSC then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
There is also this warning: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted or removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I also see that the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
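One plausible fix for the symptoms described is to make sure the sitemap index itself only references HTTPS sub-sitemaps, then resubmit it in GSC. A sketch of a corrected sitemap_index.xml, using the placeholder domain.com URLs from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Every sub-sitemap listed as HTTPS, matching the site's live protocol -->
  <sitemap><loc>https://www.domain.com/marketing-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.domain.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.domain.com/post-sitemap.xml</loc></sitemap>
</sitemapindex>
```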
Technical SEO | Dan-Lawrence
-
What is the difference between a 301 redirect to a 404 vs. just a 404?
A bunch of pages on my site are set to 301 redirect to our 404 page. Intuitively, I feel like they should just return a 404 from the page's own URL rather than redirecting to the 404 page. How do I explain to my developer that these pages should not redirect but should simply 404? Is there much of a difference between redirecting first vs. 404ing directly? Thanks!
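If the site runs Apache (an assumption; the question doesn't say), serving the 404 directly instead of redirecting can look like this in .htaccess; the path is hypothetical:

```apache
RewriteEngine On
# Instead of: Redirect 301 /removed-page /404-page
# answer the dead URL with a 404 status directly, so the URL itself
# reports "not found" rather than bouncing through a redirect first.
RewriteRule ^removed-page$ - [R=404,L]
```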
Technical SEO | gaytravel
-
HTACCESS redirect vs. forwarding
I'm having trouble using an htaccess redirect to send a subdomain to a new domain on a different server. Tech support at GoDaddy suggested I forward the subdomain instead. The subdomain has already been cached by Google. Will forwarding in this way have the same effect (SEO-wise) as an htaccess redirect?
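For reference, the .htaccess version of that redirect usually looks something like this (hypothetical hostnames; assumes mod_rewrite is available). Registrar-level forwarding is typically implemented as a 301 or 302 anyway, so whether it behaves the same for SEO depends on which status code GoDaddy sends:

```apache
RewriteEngine On
# Match requests arriving on the old subdomain (hypothetical name)
RewriteCond %{HTTP_HOST} ^sub\.old-domain\.com$ [NC]
# 301 everything, path included, to the new domain
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```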
Technical SEO | triple9
-
CamelCase vs lowernodash
I'm in the process of reviewing on-site URL structure on a few sites, and I've run into something I can't decide between. I am forced to choose between these two examples:
MediaRoom/CaseStudies.aspx (camel case)
mediaroom/casestudies (all lower case, mashed together, no dashes)
I would personally rather see media-room/case-studies/. However, implementing the dashes would require manually rewriting about 10,000 URLs. Implementing 301s from the existing structure to whatever I choose would be trivial (sketched below), so there is no concern there. Given the choice between CamelCase and lower-mashed, which would you choose? Why?
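Whichever structure wins, the 301s mentioned above could look like this in Apache (a sketch using the example paths from the question; assumes mod_rewrite):

```apache
RewriteEngine On
# Send the old CamelCase .aspx URL to its dashed successor with a 301;
# [NC] catches casing variants of the old URL as well
RewriteRule ^MediaRoom/CaseStudies\.aspx$ /media-room/case-studies/ [R=301,NC,L]
```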
Technical SEO | MRCSearch
-
Internal vs external blog and best way to set up
I have a client that has two domains registered: one uses www.keywordaustralia.com, the other uses www.keywordaelaide.com. He had already bought and used the first domain when he came to me; I suggested the second was worth buying, as going for a more local keyword would be more appropriate. Now I have suggested to him that a blog would be a worthy use of the second domain and a way to build links to his site. However, I am reading that, as all links would come from the same site, it won't be worth much in the long run, and that an internal blog is better because it means updated content on his own site. Should I use the second domain for the blog, or just 301 the second domain to his first domain? Or is it viable to use the second domain as the blog and just set up an RSS feed on his page? Is there a way to have the second domain somehow 'linked' to his first domain with the blog so that Google sees them as connected? NOOBIE o_0
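If the 301 route wins out, pointing the entire second domain at the first is a small server config change. A sketch in Apache, using the placeholder domains from the question:

```apache
RewriteEngine On
# Catch any request arriving on the second domain...
RewriteCond %{HTTP_HOST} ^(www\.)?keywordaelaide\.com$ [NC]
# ...and 301 it, path intact, to the matching URL on the main domain
RewriteRule ^(.*)$ https://www.keywordaustralia.com/$1 [R=301,L]
```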
Technical SEO | mamacassi