Duplicate Content Penalties, International Sites
-
We're in the process of rolling out a new domestic (US) website design. If we copy the same theme/content to our international subsidiaries, would the duplicate content penalty still apply? All international sites would carry the country-specific domain (.co.uk, .eu, etc.). This question is for English-only content; I'm assuming translated content would not carry a penalty.
-
The consensus is that even though the content is the same, it will rank locally on country-specific domains. Can anyone provide examples where this is currently working?
-
I use Rackspace Cloud Sites. Is there a way I can request to have a domain pushed to a server pool in the UK or Canada, for example?
-
This video from Matt Cutts will help too: http://www.youtube.com/watch?v=Ets7nHOV1Yo
-
I asked this exact question of Greg Grothaus from Google at a conference back in 2009, and his answer was that duplicated content across different TLDs shouldn't be something to be too concerned about. Realistically, search engines will decide which version of the site is more relevant for a particular geographic audience.
-
When it comes to English, my advice is that there are ways to make the content "different". Just think of how differently Brits and Americans spell many words. Then apply all the classic international SEO tactics: links from the country your site has to rank in, a local IP, a local address, and so on.
Apart from that, if your international sites use their corresponding ccTLDs (.co.uk, .au, .in, etc.) and you specify that the .com is for the USA, Google is actually quite good at noticing which site should rank in each country.
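One way to make that country mapping explicit, beyond choosing ccTLDs, is Google's `rel="alternate" hreflang` annotation in each page's `<head>`. The domains and path below are placeholders, not from the thread; a sketch for a US/UK pair might look like:

```html
<!-- Placed in the <head> of every variant; each page lists all
     variants including itself. example.com and example.co.uk are
     illustrative domains. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<!-- x-default tells Google which version to show users who match
     no listed locale. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```

This tells Google the pages are deliberate regional variants rather than accidental duplicates, which is exactly the concern raised in the original question.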
-
Yes. Translated content will not incur a penalty. As long as you launch each site on a domain with the proper local TLD and add locally targeted content, you should be OK. Additionally, you may want to consider hosting the website with a local hosting provider.
This should also apply to English-language content modified for a UK audience, since UK English is technically considered different from US English. We have multiple English-language international websites hosted on local TLDs that rank locally for their respective keywords.
Google has become much smarter at detecting geo-local signals and should serve the appropriate site on the SERP without causing duplicate content issues.
-
I think this sort of duplicate content is something Google sees often. If you are copying everything exactly between domains, I'd question whether you need multiple sites at all. Presuming your content has country-specific differences, you'll be OK.
Don't forget to register your target market for each domain in Google Webmaster Tools. Maybe build some new links in each locale at the time of launch (press mentions, Twitter shout-outs, etc.).
Also, you may want to consider the approach taken by Microsoft: one domain with country-specific folders (e.g. /en-us/ and /en-gb/ paths).
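Whichever structure you pick (ccTLDs or one domain with country folders), you can also declare the alternates in your XML sitemap. As a rough sketch, with hypothetical placeholder domains, a helper that emits the shared `xhtml:link` alternate entries for one page might look like this:

```python
# Sketch: generate the <xhtml:link> hreflang alternates that every
# locale's sitemap entry for a given page would share.
# The domains below are illustrative placeholders, not real sites.
LOCALES = {
    "en-us": "https://www.example.com",
    "en-gb": "https://www.example.co.uk",
    "en-au": "https://www.example.com.au",
}

def hreflang_links(path):
    """Return one alternate-link line per locale for the given path."""
    return [
        f'<xhtml:link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in sorted(LOCALES.items())
    ]

print("\n".join(hreflang_links("/products/")))
```

Each locale's sitemap `<url>` entry would include the same block of alternates, so Google can tie the regional variants together no matter which version it crawls first.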