Duplicate Content Penalties, International Sites
-
We're in the process of rolling out a new domestic (US) website design. If we copy the same theme/content to our international subsidiaries, would the duplicate content penalty still apply? All international sites would use country-specific domains (.co.uk, .eu, etc.). This question concerns English-only content; I'm assuming translated content would not carry a penalty.
-
The consensus is that even though the content is the same, it will rank locally on country-specific domains. Can anyone provide examples where this is currently working?
-
I use Rackspace Cloud Sites. Is there a way I can request to have a domain served from a pool in the UK or Canada, for example?
-
This video from Matt Cutts will help too: http://www.youtube.com/watch?v=Ets7nHOV1Yo
-
I asked this exact question of Greg Grothaus from Google at a conference back in 2009, and his answer was that duplicated content across different TLDs shouldn't be something to be too concerned about. Realistically, search engines will decide which version of the site is most relevant for a particular geographic audience.
-
When it comes to English, I'd just point out that there are ways to differentiate the content. Think about how differently Brits and Americans write many words. Then apply all the classic international SEO tactics (links from the country your site has to rank in, local IP, local address...).
Apart from that, if your international sites use their corresponding TLDs (.co.uk, .au, .in...) and you specify in Google that the .com targets the USA, Google is actually quite good at determining which site should rank in each country.
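Beyond the ccTLD signal described above, hreflang annotations are the standard mechanism for telling search engines explicitly which URL targets which locale. As a minimal sketch (in Python, with hypothetical example domains — adapt the mapping to your own sites), here is how you might generate the `<link rel="alternate">` tags that belong in the `<head>` of every country version:

```python
# Hypothetical mapping of locale codes to the country-specific domains.
ALTERNATES = {
    "en-us": "https://www.example.com/",
    "en-gb": "https://www.example.co.uk/",
    "en-au": "https://www.example.com.au/",
}

def hreflang_tags(alternates, x_default="en-us"):
    """Return the hreflang <link> tags to place in the <head> of each version."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]
    # x-default tells search engines which version to show when no locale matches.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{alternates[x_default]}" />'
    )
    return "\n".join(tags)

print(hreflang_tags(ALTERNATES))
```

Each version of the page should list all of its alternates, including itself, so the annotations are reciprocal.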
-
Yes. Translated content will not trigger a penalty. As long as you launch the site on a domain with the proper local TLD and add locally targeted content, you should be OK. Additionally, you may want to consider hosting the website with a local hosting provider.
This also applies to English-language content modified for a UK audience, since UK English is technically considered different from US English. We have multiple English-language international websites hosted on local TLDs that rank locally for their respective keywords.
Google has become much smarter at detecting geo-local signals, and it should serve the appropriate site in the SERPs without causing duplicate content issues.
-
I think this sort of duplicate content is something Google sees often. If you are copying everything exactly between domains, I'd question whether you need multiple sites. Provided your content has country-specific differences, you'll be OK.
Don't forget to register the target market for each URL in Google Webmaster Tools. Maybe build some new links in each locale at the time of launch (press mentions, Twitter shout-outs, etc.).
You may also want to consider the approach taken by Microsoft: one domain with country-specific folders.
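The single-domain, country-folder setup mentioned above can be sketched as a simple URL scheme. The domain and folder names here are hypothetical, purely to illustrate the structure:

```python
# Hypothetical base domain for the single-domain approach.
BASE = "https://www.example.com"

def localized_url(path, locale="en-us"):
    """Build a country-folder URL, e.g. /en-gb/pricing, for a given locale."""
    return f"{BASE}/{locale}/{path.lstrip('/')}"

print(localized_url("pricing", "en-gb"))  # → https://www.example.com/en-gb/pricing
```

Each folder can then be geo-targeted separately in Google Webmaster Tools, which is the main practical advantage of this layout over separate ccTLDs: one domain accumulates all the link authority.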