Duplicate Content Penalties, International Sites
-
We're in the process of rolling out a new domestic (US) website design. If we copy the same theme and content to our international subsidiaries, would the duplicate content penalty still apply? All international sites would carry a country-specific domain: .co.uk, .eu, etc. This question is about English-only content; I'm assuming translated content would not carry a penalty.
-
The consensus is that even though the content is the same, it will rank locally on country-specific domains. Can anyone provide examples where this is currently working?
-
I use Rackspace Cloud Sites. Is there a way I can request to have a domain pushed to a pool you host in the UK or Canada, for example?
-
This video from Matt Cutts will help too: http://www.youtube.com/watch?v=Ets7nHOV1Yo
-
I asked Greg Grothaus from Google this exact question at a conference back in 2009, and his answer was that duplicated content across different TLDs shouldn't be something to be too concerned about. Realistically, search engines will decide which version of the site is more relevant for a particular geographic audience.
-
When it comes to English, my advice is that there are ways to make the content "different". Just think of how differently Brits and Americans spell many words. Then apply all the classic international SEO tactics (links from the country you want each site to rank in, local IP, local address...).
Apart from that, if the international sites sit on their corresponding TLDs (.co.uk, .au, .in...) and you specify that the .com is for the USA, Google is actually quite good at noticing which site should rank in which country.
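The answer above leans on Google working out on its own which ccTLD belongs to which country. A complementary way to make that mapping explicit, which nobody in this thread mentions, is hreflang annotations in each page's head. Below is a minimal sketch in Python that simply prints the tags; the domain names are placeholders, not the poster's actual sites.

```python
# Minimal sketch of hreflang annotations for a ccTLD setup.
# All domain names below are placeholders.
CCTLD_VARIANTS = {
    "en-us": "https://www.example.com/",
    "en-gb": "https://www.example.co.uk/",
    "en-au": "https://www.example.com.au/",
}

def hreflang_tags(variants, x_default="en-us"):
    """Build the <link rel="alternate" hreflang="..."> block for the <head>."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    ]
    # x-default covers visitors who match none of the listed locales.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{variants[x_default]}" />'
    )
    return "\n".join(tags)

if __name__ == "__main__":
    # Every variant should carry the same complete block,
    # including a self-referencing entry.
    print(hreflang_tags(CCTLD_VARIANTS))
```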
-
Yes. Translated content will not be treated as duplicate content: as long as you launch the site on a domain with the proper local TLD and add locally targeted content, you should be OK. Additionally, you may want to consider hosting the website with a local hosting provider.
This should also apply to English-language content modified for a UK audience, since UK English is technically considered different from US English. We have multiple English-language international websites hosted on local TLDs that rank locally for their respective keywords.
Google has become much smarter at detecting geo-local signals, and it should serve the appropriate site in the SERPs without causing duplicate content issues.
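The local-hosting suggestion above is easy to sanity-check: resolve the site's hostname to its server IPs and then place those IPs with a geo-IP database. A small standard-library sketch, with a placeholder hostname:

```python
import socket

def server_ips(hostname):
    """Resolve a hostname to the public IP addresses it is served from."""
    infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    # Each entry is (family, type, proto, canonname, sockaddr); the address
    # is the first field of sockaddr for both IPv4 and IPv6.
    return sorted({info[4][0] for info in infos})

if __name__ == "__main__":
    # Placeholder hostname; look the result up in a geo-IP database to see
    # which country the host sits in.
    for ip in server_ips("www.example.co.uk"):
        print(ip)
```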
-
I think this sort of duplicate content is something that Google sees often. If you are copying everything exactly between domains, I'd question whether you need multiple sites. Presuming your content has country-specific differences, you'll be OK.
Don't forget to register the target market for each site in Google Webmaster Tools. Maybe build some new links in each locale at the time of launch (press mentions, Twitter shout-outs, etc.).
You may also want to consider the approach taken by Microsoft: one domain with country-specific folders (e.g. an /en-gb/ folder for the UK).
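If you follow the single-domain, country-folder route mentioned above, the same language/country mapping can be declared in the XML sitemap rather than in every page's head. A rough sketch with placeholder folder names, assuming sitemap-level hreflang annotations:

```python
# Sketch of a country-folder layout with hreflang alternates expressed in the
# XML sitemap instead of in each page's <head>. Domain and folders are placeholders.
LOCALES = {
    "en-us": "https://www.example.com/en-us/",
    "en-gb": "https://www.example.com/en-gb/",
    "en-au": "https://www.example.com/en-au/",
}

def sitemap_entry(loc, locales):
    """Return one <url> element whose alternates list every locale folder."""
    alternates = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{url}"/>'
        for lang, url in locales.items()
    )
    return f"  <url>\n    <loc>{loc}</loc>\n{alternates}\n  </url>"

if __name__ == "__main__":
    # The real sitemap needs xmlns:xhtml="http://www.w3.org/1999/xhtml"
    # declared on <urlset>; every locale's <url> carries the same alternates.
    for url in LOCALES.values():
        print(sitemap_entry(url, LOCALES))
```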
-
Related Questions
-
Recurring events and duplicate content
Does anyone have tips on how to work in an event system to avoid duplicate content for recurring events? How do I best utilize on-page optimization?
Technical SEO | megan.helmer0
-
Duplicate Content Issues with Pagination
Hi Moz Community, we're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 items or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue (a sketch of that approach follows after this question). Thoughts, ideas or suggestions are welcome. Thanks
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Technical SEO | znotes0
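For the pagination question above, one way to express "canonical to the view-all page" is to normalise each paginated URL by dropping its page-size and page-number parameters and emitting that as the canonical target. A sketch using the URLs from the question, assuming n and p are the only pagination parameters:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def view_all_canonical(url):
    """Drop the page-size (n) and page-number (p) parameters so every
    paginated variant maps to the same view-all canonical URL."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in ("n", "p")]
    return urlunsplit(parts._replace(query=urlencode(query)))

# Both paginated variants from the question collapse to one canonical target:
print(view_all_canonical(
    "https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2"))
print(view_all_canonical(
    "https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4"))
# -> https://www.example.com/gifts/for-the-couple?view=all  (for both)
```
-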
Canonical Tags - Do they only apply to internal duplicate content?
Hi Moz, I've had a complaint from a company whose feed we use to populate a restaurant product list. They are upset that on our product pages we have canonical tags linking back to ourselves. These are in place because we have international versions of the site. They believe that, because they are the original source of the content, we need to canonical back to them. Can I please confirm that canonical tags are purely an internal duplicate content strategy? A canonical isn't telling Google that, out of all the content on the web, this is the original source; it's just saying that, of the content on our domains, this is the version that should be ranked. Is that correct? Furthermore, if we implemented a canonical tag linking to Best Restaurants, it would de-index all of our restaurant listings and pages and pass the authority of these pages to their site. Is this correct? Thanks!
Technical SEO | benj20341
-
Do mobile and desktop sites that pull content from the same source count as duplicate content?
We are about to launch a mobile site that pulls content from the same CMS, including metadata. The two sites have different hostnames, however (www.abcd.com and www.m.abcd.com). How will this affect us in terms of search engine ranking?
Technical SEO | ovenbird0
-
How to fix duplicate content errors with Go Daddy Site
I have a friend who uses a free GoDaddy template for his business website. I ran his site through Moz Crawl Diagnostics, and wow: 395 errors, mostly duplicate content and duplicate page titles. I dug further and found the site was doing this: URL: www.businessname.com/page1.php and the duplicate: businessname.com/page1.php. Essentially, the duplicate is missing the www, and it does this two hundred times. How do I explain to him what is happening? (A quick way to check whether a redirect is in place is sketched below.)
Technical SEO | cschwartzel0
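For the www/non-www duplication above (and the similar Website Tonight question further down), whether a 301 redirect is actually in place is easy to verify: request each hostname without following redirects and look at the raw status code. A standard-library sketch with placeholder hostnames:

```python
import http.client

def check_redirect(host, path="/"):
    """Request a page without following redirects, so the raw status code
    and Location header stay visible."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path, headers={"User-Agent": "redirect-check"})
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

if __name__ == "__main__":
    # Placeholder domain: compare the bare hostname with the www. form.
    # A healthy setup 301s one form to the other; two 200s means both
    # versions resolve as separate, duplicate pages.
    for host in ("example.com", "www.example.com"):
        status, location = check_redirect(host)
        print(f"http://{host}/ -> {status} {location or ''}")
```
-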
Duplicate Content - That Old Chestnut!!!
Hi guys, hope all is well. I have a question, if I may. We have written several articles and I want to find out the best way to post them, but I have a number of concerns. I am hoping to use the content to increase the kudos of our site by providing quality content and hopefully earning decent backlinks.
1. In terms of duplicate content, should I only post each article in one place, or should I post it in several places? Also, where would you say the top 5 or 10 places would be? These are articles on XML, social media and backlinks.
2. Can I post an article on another blog or article directory and also post it on my website's blog, or is this a bad idea?
A million thanks for any guidance. Kind regards, C
Technical SEO | fenwaymedia0
-
Duplicate Page Title & Content Penalty On Website Tonight Platform
I built my primary website on Website Tonight (WT) five years ago when I was a net newbie, and I'm presently new to SEOmoz. The initial crawl flagged a duplicate page title and duplicate content problem with my home page. It turns out that the WT platform makes you assign a file name to your home page, e.g. www.business.com/homepage.html, that differs from the www.business.com you want as your home page URL. Apparently the search engines are recognizing these identical pages as separate and duplicate. I know the standard answer would be to just do a 301 redirect from the long file name to the short file name, end of story. But WT does not allow you to set up 301 redirects, and it also does not give you access to the .htaccess file to fix this yourself manually. I spoke to the folks at WT tonight and they claim that the platform does 301 redirects automatically. But if that is true, then why am I getting the error message in SEOmoz? Does anyone know if this is a problem? If so, does anyone here have a fix? Thanks in advance. Sincerely - Bill in Denver
Technical SEO | anxietycoach0
-
What to do about similar content getting penalized as duplicate?
We have hundreds of pages that are getting categorized as duplicate content because they are so similar. However, they are different content. The background is that they are name pages, and each name has its own URL when you click on it. What should we do? We can't canonicalize any of the pages to each other because they are different names. Thank you!
Technical SEO | bonnierSEO0