Duplicate Content Regarding Translated Pages
-
If we have one page in English and another that is translated into Spanish, does Google consider that duplicate content? I don't know whether having the content in a different language makes it different, or whether it will get flagged.
Thanks,
Ruben
-
I was hoping to learn more about Portafina. The duplicate content question for content management was something I was not sure about, but this helped me a lot.
-
Hey there. I asked a question that has some similarities, so I figured I would share it here.
https://moz.com/community/q/competitor-has-same-site-with-multiple-languages
-
It will not be considered duplicate content. However, you should use hreflang markup so that Google knows the two versions of a page are actually the same (just translated) and can show the correct language version to searchers.
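For example, a minimal sketch (the example.com domain and paths are placeholders, not from this thread) - both the English page and its Spanish translation would carry the same set of annotations in their <head>:
<!-- Hypothetical markup placed on https://example.com/en/page/ and https://example.com/es/page/ alike -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page/" />
Each language version points to itself and to its translation, so Google can match the right version to the searcher's language.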
Related Questions
-
"Duplicate without user-selected canonical” - impact to SERPs
Hello, we are facing some issues on our project and we would like to get some advice. Scenario
International SEO | | Alex_Pisa
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc..) all in French language . All sites have nearly the same content & structure, only minor text (some headings and phone numbers due to different countries are different). There are many good quality pages, but again they are the same over all domains. Goal
We want local domains (be, ch, fr, etc.) to appear in SERPs and also comply with Google policy of local language variants and/or canonical links. Current solution
Currently we don’t use canonicals, instead we use rel="alternate" hreflang="x-default": <link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" /> <link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" /> <link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" /> <link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" /> <link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" /> <link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" /> Issue
After Googlebot crawled the websites we see lot of “Duplicate without user-selected canonical” in Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs we can see Google has decided that canonical URL points to (example): User-declared canonical: None
Google-selected canonical: …same page, but on a different domain Strange is that even those URLs are on Google and can be found in SERPs. Obviously Google doesn’t know what to make of it. We noticed many websites in the same scenario use a self-referencing approach which is not really “kosher” - we are afraid if we use the same approach we can get penalized by Google. Question: What do you suggest to fix the “Duplicate without user-selected canonical” in our scenario? Any suggestions/ideas appreciated, thanks. Regards.0 -
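For reference, a rough sketch (using the brandName placeholder domains from the question; this is only an illustration of one commonly discussed option, not a recommendation from this thread) of how a self-referencing canonical can sit alongside the existing hreflang set - here, in the <head> of the Belgian homepage:
<!-- Hypothetical <head> markup for https://www.brandName.be/ (illustrative sketch only) -->
<link rel="canonical" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
In this sketch each domain canonicalizes to itself while the hreflang set stays identical across all of them.
-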
Duplicate store (subdomain) not ranking
International SEO | grocare
I have a store (www.grocare.com) and I recently made a duplicate store (in.grocare.com) for a different region. Both have different currencies and target different regions. I even targeted the new store (in.grocare.com) to that particular country in Google Search Console. They both have different hreflang tags to mark the different regions too. It has now been a month since this was done, but the new store is not ranking in its region. The old one is still ranking, and I have to redirect traffic from the old store to the new one based on IP.
I thought making a new store and targeting it specifically would help with rankings. Am I doing something wrong here?
-
International Sites and Duplicate Content
International SEO | guidoampollini
Hello, I am working on a project where I have some doubts regarding the structure of international sites and multiple languages. The website is in the fashion industry, and I think this is a common problem for the industry. The website is translated into 5 languages and sells in 21 countries. As you can imagine, this creates a huge number of URLs - so many that with ScreamingFrog I can't even complete the crawl. For example, the UK site is visible in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously for SEO only the first version is important. As another example, the French site is available in 5 languages, and again:
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 main issues:
Endless crawling - with crawlers not focusing on the most important pages
Duplication of content
Wrong geo URLs ranking in Google
I have already implemented hreflang but didn't notice any improvements. Therefore my question is: should I exclude the non-appropriate targeting with "robots.txt" and "noindex"? Perhaps for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve by doing this is to have the crawlers more focused on the important SEO pages, avoid content duplication, and stop the wrong URLs from ranking on local Google. Please comment.
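For reference, a rough sketch of the kind of robots.txt exclusion described above (the paths are just the placeholders from the question; this only illustrates the option being asked about, not a recommendation):
# Hypothetical robots.txt sketch - block the language/country combinations
# that are not the intended local version (illustration only)
User-agent: *
Disallow: /it/GB/
Disallow: /fr/GB/
Disallow: /de/GB/
Disallow: /es/GB/
Disallow: /en/FR/
Disallow: /it/FR/
Disallow: /de/FR/
Disallow: /es/FR/
One caveat worth noting: a URL blocked in robots.txt cannot have its noindex or hreflang tags read by Google, so robots.txt exclusion and noindex generally cannot be combined on the same URLs.
-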
Is there any reason for a massive decrease in indexed pages?
International SEO | mat-relevance
Hi, I'm helping with SEO for a big e-commerce site in LatAm, and one thing we've experienced during the last months is that our search traffic has dropped and the number of indexed pages has decreased terribly. The site had over 2 million indexed pages (which was way too much, since we believe that around 10k would be more than enough to hold the roughly 6K SKUs), but now this number has decreased to less than 3K in less than 2 months. I've also noticed that most of the results in which the site still appears are .pdf or .doc files rather than actual content on the website. I've checked the following:
Robots (there is no block)
Webmaster Tools
Penalties
Duplicated content
I don't know where else to look. Can anyone help? Thanks in advance!
-
Showing different content according to different geo-locations on the same URL
International SEO | seoec
We would like our website to show different content according to different geo-locations (but in the same language). For example, if www.mywebsite.com is accessed from the US, it would show text (in English) appealing to North Americans, but if accessed from Japan, it would show text (also in English) that appeals more to Japanese people. In the Middle East, we would like the website to show different images than those shown in the US and Asia. Our main concern is that we would like to keep the same URL. How will Google index these pages? Will it index the Japan version of www.mywebsite.com for Asian searchers and the US version for North American searchers? Will Google penalise us for showing different content across geo-locations on the same URL? What if a URL is meant to show content only in Japan? Are there any other issues that we should be looking out for? Kindest Regards, L.B.
-
Content in different languages
International SEO | Sayers
Hi all, I need some advice about displaying content in different languages. Currently I 301 to the correct locale based on IP, e.g. German visitors get 301ed from site.com to site.com/de, and English visitors from site.com to site.com/en. Is this the best way, or would it be better to change the content based on the browser and keep the URL the same? I have <link rel="alternate" hreflang="hr" href="/hr" /> tags implemented for all locales on the site. Thanks
-
Non-US site pages indexed in US Google search
International SEO | Corel
Hi, We are having a global, site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US, and not detecting the difference due to our URL structure? Below are examples of two of our URLs for reference - one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to nofollow address the problem? Thank you. Angie
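For reference, hreflang annotations can also be supplied in an XML sitemap rather than in page markup - below is a rough sketch using the two URL paths from the question (the www.example.com domain is a placeholder, and this is only an illustration, not something suggested in the thread):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/us/en/prod4130078/2500058/catalog20038/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/us/en/prod4130078/2500058/catalog20038/" />
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/en/prod4130078/2500058/catalog50008/" />
  </url>
</urlset>
Each URL entry lists itself and its regional alternates; a complete sitemap would repeat an equivalent <url> entry for the Canadian page, giving Google an explicit signal about which regional page belongs in which country's results.
-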
Is duplicate content a concern across multiple ccTLDs?
International SEO | eMagineSEO
Looking for experienced feedback on a new client challenge. Multiple pages of content in the English language are replicated across multiple ccTLDs in addition to the .com address we're working with. Is duplicate content a concern in this case? What measures are recommended to help preserve their North American search inclusion while not being a detriment to the external properties (European/Asian ccTLDs) aimed at other geos/languages? EDIT: After posting, I read this - any thoughts? http://searchengineland.com/google-webmaster-tools-provides-details-on-duplicate-content-across-domains-99246