Do you think the SEs would see this as duplicate content?
-
Hi Mozzers!
I have a U.S. website and a Chinese version of that U.S. website.
The China site only gets direct and PPC traffic because the robots.txt file is disallowing the SEs from crawling it.
Question: If I added English SKU descriptions and English content to the China site (content that is also on our U.S. site), will the SEs penalize us for duplicate content even though the robots.txt file doesn't allow them to see it?
I plan on translating the descriptions and content to Chinese at a later date, but wanted to ask if the above was an issue.
Thanks Mozzers!
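For reference, the blanket block described above would look something like this in robots.txt (a sketch; the exact rules on the China site may differ):

```
User-agent: *
Disallow: /
```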
-
Your robots.txt should play no part in this. Leave robots.txt configured however it would normally be for the website. Google knows that if you're serving a different country from a different IP along with a country-code TLD, you are not breaking their duplicate-content rules: it is natural for somebody to have one site in one country and another site in another country with the exact same content, because they serve different target audiences. The two sites won't compete in the same search rankings, and each will be a good result for its own country's audience.
Do not block anything with robots.txt that you don't genuinely need to block for some other reason.
Long story short: if you're only using robots.txt to hide this content from the search engines, you don't need to worry; you can remove that block.
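To illustrate the point above: a crawler that honors robots.txt never fetches a blanket-blocked page at all, so its text is never even available for a duplicate-content comparison. A minimal sketch using Python's standard-library robot-rules parser (example.cn is a placeholder domain, not the asker's actual site):

```python
from urllib import robotparser

# Hypothetical blanket block of the kind described in the question.
blocked_rules = """\
User-agent: *
Disallow: /
"""

# The same file with the block removed, per the advice above
# (an empty Disallow allows everything).
open_rules = """\
User-agent: *
Disallow:
"""

def can_googlebot_fetch(rules: str, url: str) -> bool:
    """Return True if a crawler identifying as Googlebot may fetch the URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("Googlebot", url)

url = "https://example.cn/sku/12345"  # placeholder URL
print(can_googlebot_fetch(blocked_rules, url))  # False: page is never crawled
print(can_googlebot_fetch(open_rules, url))     # True: page is crawlable
```

Whether removing the block is safe for rankings is the duplicate-content question discussed in this thread; the parser only shows what the crawler is allowed to fetch.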
-
Hi Thomas. Thanks again.
We have separate domains in separate countries--I think we're set there.
The question is whether we'd have duplicate content across the sites once the robots.txt block on one of them is removed.
-
For best practices on where to host for individual countries, check out this Whiteboard Friday:
http://moz.com/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Have I answered all your questions?
All the best,
Thomas
-
Happy to help. It will definitely help to have each domain hosted inside the country it is targeting.
Sincerely,
Thomas
-
Thanks Thomas!
I should point out that the U.S. domain is hosted in the U.S. and the China domain is hosted in China.
Not sure if that makes a lick of difference.
-
If you have a Chinese version and a US version, each on its own country-code TLD, you will get no penalty from Google. You can keep the exact same content, though you should obviously get the translation in place as soon as you can.
You can do this without any worry about duplicate content whatsoever.
For example, I could have two sites, example.co.uk and example.com, with identical content, and Google would not penalize either one. Even though the words are exactly the same, in the same language, the sites are for different countries.
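One way to make that country targeting explicit (not mentioned above, but it is Google's documented mechanism for exactly this situation) is hreflang annotations: each page lists every regional version of itself, and Google then serves each country its own version instead of treating them as duplicates. A sketch for the example.co.uk / example.com pair, which would go in the `<head>` of both pages:

```html
<!-- Hypothetical hreflang annotations for two same-language,
     different-country sites. URLs are placeholders. -->
<link rel="alternate" hreflang="en-GB" href="https://example.co.uk/" />
<link rel="alternate" hreflang="en-US" href="https://example.com/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```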
Sincerely,
Thomas