Is the same content posted under different international TLDs a problem?
-
Dear all,
I have a site that owns the .be, .cn, .biz, .com.mx, .de, .us, .info, .net, and .org domains. All of them run from the same server and serve identical content, i.e. .com.mx/our-services is the same page as .com/our-services.
Google Webmaster Help posted a video saying that multiple international TLDs with the same content "should be OK" - http://www.youtube.com/watch?v=Ets7nHOV1Yo - however, I would like confirmation from practitioners!
What is the best practice in this case? Considering none of the content is customised, should I create root-level redirects to our .com, or leave things as they are?
Thanks!
Christian
-
Here's my two cents; take it for what it's worth. We took a .com site written in English and hosted in the U.S., duplicated the content, and redesigned the website. The duplicated content and new design were then hosted in Germany, hoping to capture UK and other English-language searches in Europe. The website was a complete flop. We then had a professional translation service convert the exact content from the U.S. website into German, updated the new site, and it started performing very well. We took the same approach in France and our other target markets; however, it didn't work in the UK.
-
Most responses I have seen to similar questions recommend redirecting the country-based TLDs to folders named for each country, but that usually assumes you have translations available.
If the content is exactly the same and not translated, I would probably just redirect everything to the .com domain. That TLD gets the most respect, and most users recognize it regardless of country. Also, if you use 301s, it will help consolidate your link popularity under the one domain name for people who don't actually check your URL and link to you as example.de instead of example.com.
That way, once you do put up translations (which you should be doing if you get lots of international users), you can switch each redirect to the appropriate folder and still keep your links consolidated.
Just my 2 cents