Is the same content posted under different international TLDs a problem?
-
Dear all,
I have a site that owns .be, .cn, .biz, .com.mx, .de, .us, .info, .net, and .org. All of them run from the same server and have identical content, i.e. .com.mx/our-services is the same as .com/our-services.
Google Webmaster Help made a video saying that multiple international TLDs with the same content 'should be OK' - http://www.youtube.com/watch?v=Ets7nHOV1Yo - however, I would like confirmation from practitioners!
What is the best practice in this case? Considering none of the content is customised, should I create root-level redirects to our .com, or leave things as they are?
Thanks!
Christian
-
Here are my two cents; take them for what they're worth. We took a .com site written in English and hosted in the U.S., duplicated the content, and redesigned the website. The duplicated content and new design were then hosted in Germany, hoping to capture UK and English-language searches in Europe; the website was a complete flop. We then had a professional translation service convert the exact content of the U.S. website into German, updated the new site, and it started performing very well. We took the same approach in France and our other target markets; however, it didn't work in the UK.
-
Most responses I have seen to similar questions recommend redirecting the country-based TLDs to folders named for each country, but that usually assumes you have translations available.
If the content is exactly the same and not translated, I would probably just redirect everything to the .com domain. That TLD gets the most respect, and most users recognize it regardless of country. Also, if you use 301s, it will help consolidate your link popularity under the one domain name for people who don't actually check your URL and link to you as example.de instead of example.com.
That way, once you do put up translations (which you should be doing if you get lots of international users), you can switch the redirect to the appropriate folder at that point and still have your links consolidated.
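As a sketch of that consolidation: assuming an Apache server where all the ccTLDs resolve to the same site (example.com is a placeholder domain), a catch-all 301 in .htaccess might look like this:

```apache
# Hypothetical sketch: 301 any host other than www.example.com
# to www.example.com, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Later, once translations exist, the RewriteRule target could point at a country folder (e.g. /de/$1) instead of the root, keeping the link equity that has already been consolidated.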
Just my 2 cents
Related Questions
-
Geo Targeting Content Question
Hi, all. First question here, so be gentle, please. My question is about geo-targeted dynamic content. At the moment we run a .com domain with, for example, an article about running headphones, and then at the end - taking up about 40% of the content - is a review of some headphones people can buy, with affiliate links. We have a .co.uk site with the same page about running headphones and then 10 headphones for the UK market. Note: rel="alternate" is used on the pages to point to each other, therefore (hopefully) removing duplicate content issues. This design works well, but it involves having to build links to two pages in the case of this example. What we are thinking of doing is to use just the .com domain and have the product section of the page served dynamically, i.e. people in the UK see UK products and people in the US see US products. What are people's thoughts on this technique, please? From my understanding, it shouldn't be a problem with Google for cloaking etc., because a Googlebot and a human from the same country will see the same content. The site is made in WordPress and has <....html lang="en-US"> (for the .com) in the header. Would this cause problems for the page ranking in the UK etc.? The ultimate goal of doing this would be to reduce link building efforts by halving the number of pages which links would have to be built for. I welcome any feedback. Many thanks
Technical SEO | TheMuffinMan
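For reference, the cross-pointing described in the question is usually implemented with rel="alternate" hreflang link tags in each page's head; a minimal sketch with placeholder URLs (each version lists itself and its alternate):

```html
<!-- On the .com page, and mirrored on the .co.uk page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/running-headphones/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/running-headphones/" />
```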
Does duplicate content not concern Rand?
Hello all, I'm a new SEOer and I'm currently trying to navigate the layman's minefield that is understanding duplicate content issues as best I can. I'm working on a website at the moment where there's a duplicate content issue with blog archives/categories/tags etc. I was planning to beat this by implementing a noindex meta tag on the pages with duplicate content issues. Before I go ahead with this I thought: "Hey, these Moz guys seem to know what they're doing! What would Rand do?" Blogs on the website in question appear in full and in date order relating to the tag/category/what-have-you, creating the duplicate content problem. Much like Rand's blog here at Moz - so I thought I'd have a look at the source code to see how it was dealt with. My amateur eyes could find nothing to help answer this question. E.g. both the following URLs appear in SERPs (using site:moz.com and very targeted keywords, but they're there): https://moz.com/rand/does-making-a-website-mobile-friendly-have-a-universally-positive-impact-on-mobile-traffic/ https://moz.com/rand/category/moz/ Both pages have a rel="canonical" pointing to themselves. I can understand why he wouldn't be fussed about the category not ranking, but the blog post? Is this not having a negative effect? I'm just a little confused, as there are so many conflicting "best practice" tips out there - and now, after digging around in the source code on Rand's blog, I'm more confused than ever! Any help much appreciated, Thanks
Technical SEO | sbridle
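For anyone weighing the noindex approach described in the question: the tag sits in the head of the duplicate-prone archive/category/tag pages. A minimal sketch (the canonical URL is a placeholder):

```html
<!-- On a duplicate-prone archive page: keep links crawlable, drop the page from the index -->
<meta name="robots" content="noindex, follow" />
<!-- Alternative approach: point the duplicate at the preferred URL instead of noindexing -->
<link rel="canonical" href="https://example.com/blog/original-post/" />
```

Using one or the other is the usual pattern; noindex keeps the page out of the SERPs entirely, while a canonical consolidates signals to the preferred URL.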
Internal link structure, find out if there are any internal links to this page
When I use this URL in Open Site Explorer, it says that there are no internal links:
Technical SEO | wilcoXXL
http://goo.gl/d2s6tJ
Page Authority is also 1; it should be higher if there are any internal links to it, right? But I am very sure there are links to this URL on my website, for example on this URL:
http://goo.gl/ucixRH How certain can I be of this? Because if I can be very certain, then I believe we have an internal link structure problem across our entire site.
Wordpress: Should your blog posts be noindex?
WordPress defaults all blog posts to noindex/nofollow. Is this how it should be handled? I understand the nofollow from page.com/blog to page.com/blog/blogtitle, but why noindex? We have Yoast installed and this is the default.
Technical SEO | cschwartzel
Duplicate Content
SEOmoz is reporting duplicate content for 2,000 of my pages. For example, these two URLs are reported as duplicates of each other: http://curatorseye.com/Name=“Holster-Atlas”---Used-by-British-Officers-in-the-Revolution&Item=4158
Technical SEO | jplill
http://curatorseye.com/Name=âHolster-Atlasâ---Used-by-British-Officers-in-the-Revolution&Item=4158 The actual link on the site is http://www.curatorseye.com/Name=“Holster-Atlas”---Used-by-British-Officers-in-the-Revolution&Item=4158 Any insight on how to fix this? I'm not sure where the second version of the URL is coming from. Thanks,
Janet
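The two URL versions above look like a classic encoding mix-up: the curly quote in the URL is stored as UTF-8 bytes, and somewhere those bytes are being re-read as Windows-1252, producing the â€œ garbage. A small Python sketch (using a bare curly-quote character to stand in for the ones in the URL) reproduces both versions:

```python
from urllib.parse import quote

curly = "\u201c"  # U+201C, the left curly quote used in the URL

# Percent-encoding the quote as UTF-8 yields the three bytes %E2%80%9C
encoded = quote(curly)
print(encoded)  # %E2%80%9C

# Re-reading those same UTF-8 bytes as Windows-1252 yields the mojibake
# seen in the second URL version
mojibake = curly.encode("utf-8").decode("cp1252")
print(mojibake)  # â€œ
```

If that is what is happening here, the fix is to make sure every link to the page uses one consistent, properly percent-encoded URL (and to 301 or canonicalize the stray variant to it).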
Duplicate page content
Hello, The Pro dashboard crawler bot that you get here reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks, Ian
Technical SEO | jwdl
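One common fix for this particular duplicate, assuming an Apache server (the directives below are standard mod_rewrite; the pattern is a sketch, not tested against every setup), is to 301 any /index.htm request back to the bare directory URL:

```apache
# Hypothetical .htaccess sketch: redirect externally requested
# /index.htm (in any directory) to the trailing-slash URL.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/([^?\s]*/)?index\.htm[\s?] [NC]
RewriteRule ^(.*/)?index\.htm$ /$1 [R=301,L]
```

A rel="canonical" on the page pointing at the bare URL is a reasonable backstop if a server-side redirect isn't available.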
Duplicate Content Issue with
Hello fellow Moz'rs! I'll get straight to the point here - the issue, shown in the attached image, is that every URL ending in /blog/category/name has a duplicate page at /blog/category/name/?p=contactus. Also, it's worth noting that the ?p=contactus URLs are not in the SERPs but were crawled by SEOMoz, and they are live duplicates. We are using Pinnacle Cart. Is there a way to just stop the crawlers from hitting ?p=contactus? Thank you all and happy rankings, James
Technical SEO | JamesPiper
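One common way to keep crawlers away from a URL parameter like that is robots.txt wildcard rules (which major crawlers such as Googlebot honor); a sketch:

```
# robots.txt sketch: block any URL carrying the p=contactus parameter,
# whether it is the first or a later query-string parameter
User-agent: *
Disallow: /*?p=contactus
Disallow: /*&p=contactus
```

Note that robots.txt only prevents crawling; since the parameter versions duplicate existing pages, a rel="canonical" on them pointing at the clean /blog/category/name URL is the stronger fix for the duplication itself.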
Duplicate content
I have two sentences that I want to optimize two different pages for. Sentence number one is "travel to ibiza by boat"; sentence number two is "travel to ibiza by ferry". My question is: can I have the same content on both pages except for the keywords, or will Google treat that as duplicate content and punish me? And if yes, where is the limit/border for duplicate content?
Technical SEO | stlastla