Multilingual site with untranslated content
-
We are developing a site that will have several languages.
There will be several thousand pages, the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but navigation/boilerplate will be translated.
We have hreflang alternate tags set up on each individual page, pointing to each of the other language versions; e.g. in the English version we have:
etc
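For illustration, a typical set of such annotations on the English version might look like this (example.com and the paths are placeholders, not the actual site):

```html
<!-- Illustrative only: domain and paths are placeholders -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page/" />
```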
In the Spanish version, we would point to the French version and the English version, etc.
My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages?
I am aware that having untranslated content is bad from a user perspective, but in this case it is unavoidable at first.
-
Thanks for your comments, Gianluca.
I think Google's guidelines are somewhat ambiguous. Here it does state that "if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately."
https://support.google.com/webmasters/answer/182192?hl=en
I think you've explained it nicely though.
-
At first that would be fine.
That said, this is a very specific case where you can use both hreflang and cross-domain rel="canonical".
Remember, though, that these two markups are completely independent of each other.
If you use them both, as I wrote in my reply to Yusuf, on one side you are telling Google that you want it to show a particular URL for a particular geo-targeted country/language, and on the other side you are also telling it that the geo-targeted URL is an exact copy of the canonical one.
Google will then show the geo-targeted URL in the SERPs, but with the title and meta description of the canonical one.
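For illustration, the combination described here might look like this in the head of the geo-targeted page (all URLs below are hypothetical placeholders):

```html
<!-- Hypothetical URLs: the geo-targeted French page carries both annotations -->
<link rel="alternate" hreflang="en" href="https://example.com/page/" />
<link rel="alternate" hreflang="fr" href="https://example.fr/page/" />
<!-- Cross-domain canonical pointing at the English original -->
<link rel="canonical" href="https://example.com/page/" />
```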
One more thing, and this is a strong reason for urging a complete translation in a short period of time: if the content of a URL on the French site, for instance, is in English, you cannot put "fr-FR" in the hreflang; it must be "en-FR". The consequence is that the URL will tend to be shown only for English queries done on Google.fr, not for French queries... and that means losing a lot of traffic opportunities.
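A sketch of that point, again with placeholder URLs: while the French URL still serves English content, the hreflang value pairs the English language code with the France region code:

```html
<!-- English content served on the French site: language "en", region "FR" -->
<link rel="alternate" hreflang="en-FR" href="https://example.fr/page/" />
<!-- Only once the page is actually translated would this become fr-FR -->
```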
-
Yusuf,
I'm sorry, but I have to correct you.
If two pages are in the same language but target different countries (e.g. the USA and the UK), then even if the content is the same or substantially the same, you not only can use hreflang, you should use it, in order to tell Google that one URL must be shown to US users and the other to UK ones.
Obviously, if you prefer, you can always decide to use a cross-domain rel="canonical" instead.
Remember, though, that in that case - if you are using the hreflang - Google will show the snippet components (title and meta description) of the canonical URL, even though it will display the geo-targeted URL. If, instead, you opted not to use the hreflang, people will see the canonical URL's snippet (web address included).
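A minimal sketch of that US/UK case, assuming hypothetical URLs:

```html
<!-- Same (or substantially the same) English content, geo-targeted separately -->
<link rel="alternate" hreflang="en-US" href="https://example.com/page/" />
<link rel="alternate" hreflang="en-GB" href="https://example.co.uk/page/" />
```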
-
Have you taken a look through the following:
https://support.google.com/webmasters/answer/182192?hl=en#1
https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation
"
Duplicate content and international sites
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."
-
Hi Jorge
The rel="alternate" hreflang="x" tag is not suitable for pages that are in the same language as these are essentially duplicates rather than alternative language versions.
I'd use the rel="canonical" tag to point to the main page until the translations of those pages are available.
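For example (placeholder URLs), an untranslated page in the Spanish section could point back to the English original until its translation is ready:

```html
<!-- On the untranslated /es/ page, pointing to the English version -->
<link rel="canonical" href="https://example.com/en/page/" />
```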
Webmaster Tools should allow you to see any issues.