Do you think the SEs would see this as duplicate content?
-
Hi Mozzers!
I have a U.S. website and a Chinese version of that U.S. website.
The China site only gets direct and PPC traffic because the robots.txt file is disallowing the SEs from crawling it.
Question: If I add English SKU descriptions and English content to the China site (content that is also on our U.S. site), will the SEs penalize us for duplicate content, even though the robots.txt file doesn't allow them to see it?
I plan on translating the descriptions and content to Chinese at a later date, but wanted to ask if the above was an issue.
Thanks Mozzers!
-
Your robots.txt should play no part in this. Leave it configured however it normally would be for the website. Google knows that if you're serving a different country from a different IP, on a country-specific TLD, you are not infringing its duplicate-content rules: it is natural for somebody to have one site in one country and another site in another country with exactly the same content, because they serve different target audiences. The two sites won't compete in the same Google rankings; each can be a good result for its own country's audience.
Do not block anything with robots.txt that you do not genuinely need to block.
Long story short: if you are using robots.txt only to hide this content, don't worry about it. You can remove that block.
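For illustration, here is a minimal sketch of the change being described, assuming the China site's robots.txt currently has a blanket disallow (the exact rules on the real site may differ):

```text
# Before: robots.txt on the China site blocks all crawlers
User-agent: *
Disallow: /

# After: remove the blanket block so search engines can crawl normally
User-agent: *
Disallow:
```

An empty `Disallow:` line permits crawling of the whole site; any rules you do still need (admin paths, cart pages, etc.) can stay.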
-
Hi Thomas. Thanks again.
We have separate domains in separate countries--I think we're set there.
The remaining question is whether having duplicate content across the two sites becomes a problem once the robots.txt block on the China site is removed.
-
To see best practices on where to host for individual countries, check out this Whiteboard Friday:
http://moz.com/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Have I answered all your questions?
All the best,
Thomas
-
Happy to be of help. It will definitely benefit you to have each domain hosted inside the country it is targeting.
Sincerely,
Thomas
-
Thanks Thomas!
I should point out that the U.S. domain is hosted in the U.S. and the China domain is hosted in China.
Not sure if that makes a lick of difference.
-
If you have a Chinese version and a US version, each on its own country-code TLD, you will get no penalty from Google. You can keep the exact same content, though ideally you would have the Chinese translation in place already.
You can do this without any worry about duplicate content whatsoever.
For example, I could have two sites, example.co.uk and example.com, with identical content, and I would not be penalized by Google at all, even though the words are exactly the same and in the same language, because the sites are for different countries.
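As a side note, the standard way to make this cross-country relationship explicit to Google is hreflang annotations (not strictly required here, but they remove any ambiguity). A minimal sketch, assuming hypothetical domains example.com (US) and example.cn (China) with a shared English product page:

```html
<!-- Placed in the <head> of the matching page on BOTH domains -->
<link rel="alternate" hreflang="en-US" href="https://example.com/product-sku/" />
<link rel="alternate" hreflang="en-CN" href="https://example.cn/product-sku/" />
<!-- Once the Chinese translation is live, switch the China entry to zh-CN: -->
<!-- <link rel="alternate" hreflang="zh-CN" href="https://example.cn/product-sku/" /> -->
```

Each page lists every alternate, including itself, and the annotations must be reciprocal across the two domains for Google to honor them.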
Sincerely,
Thomas