Purchasing duplicate content
-
Morning all,
I have a client who is planning to expand their product range (online dictionary sites) into new markets and is considering acquiring data sets from low-ranked competitors to supplement their own original data. These are quite large content sets, and they would mean a very high percentage of the site (hosted on a new subdomain) would be made up of duplicate content. Just to clarify, the competitor's content would stay online as well.
I need to lay out the pros and cons of this approach so they can move forward knowing the full facts. As I see it, it would mean forgoing rankings for most of the site and would require a heavy dose of original content, as well as supplementary on-page material built around the data. My main concern is that launching with this level of duplicate data would end up damaging the authority of the site and, subsequently, the overall domain.
I'd love to hear your thoughts!
-
Thanks for the great response, some really useful thoughts.
To address your final point, the site is considerably stronger than the content creator's, so it's reassuring to hear that this could be the case. Of course, we'll be recommending that as much of the data as possible is curated and that the pages are improved with original content.
-
Wow, this is a loaded question. The way I see it, we can break this up into two parts.
First, subdomains vs. domains vs. subpages. There has been a lot of discussion about which structure is most SEO-friendly, and to keep it really simple: if you're concerned about SEO, a subpage (subfolder) structure is going to be the most beneficial. If you create a separate domain, that will be duplicate content, and it does impact rankings. Subdomains are a little more complex, and I don't recommend them for SEO. In some cases Google views subdomains as spam (think of all the PBNs created with blogspot.com), and in other cases a subdomain is treated as a separate website. By structuring something as a subdomain, you're signalling that the content is different enough from the main content of the root domain that you don't feel it belongs together. An example of this being used appropriately in the wild might be different language versions of a website, which makes particular sense in countries where a single country TLD has to serve multiple languages (like Switzerland, which has four national languages).
Next, duplicate content works differently depending on whether it's duplicated internally or externally. It's common for websites to have a certain amount of duplicate or common content within their own pages. The number that has been repeated for years as a "safe" threshold is 30%, a stat Matt Cutts threw out there before he retired. I use siteliner.com to discover how much content has been replicated internally. Externally, if you have the same content as another website, it can pretty dramatically impact your rankings. Google does a decent job of assigning content to the correct website (who had it first, etc.), but they have a long way to go.
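Siteliner does this kind of check at scale, but the underlying idea is simple to sketch. Here's a minimal Python illustration (the sample texts are invented) that measures overlap between two pages using word shingles and Jaccard similarity, which is roughly the style of comparison duplicate-content tools run:

```python
import re

def shingles(text, k=5):
    """Lowercase the text, split into words, and return the set of
    overlapping k-word sequences (shingles)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b, k=5):
    """Jaccard similarity of the two pages' shingle sets: 0.0 means no
    shared k-word sequences, 1.0 means the texts are identical."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Invented example pages: mostly identical with a changed ending.
original = "The quick brown fox jumps over the lazy dog near the river bank."
copied = "The quick brown fox jumps over the lazy dog near the old mill."
print(round(similarity(original, copied, k=3), 2))  # prints 0.69
```

Anything scoring high against another site's page is a candidate for rewriting or canonicalization.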
If you can assimilate the new content and redirect the old pages to their new locations on a 1:1 basis, it's probably safe enough to do, and hopefully you'll structure it in a way that makes it useful to users. If you can't perform the redirects, I think you're more likely to struggle to achieve your SEO goals for those new pages. In that case, take the time to set realistic expectations and track something like user engagement on new vs. old content so you have a realistic understanding of your successes and challenges.
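On the mechanics of those 1:1 redirects: the old-to-new mapping is usually built from a crawl export and then turned into server rules. A minimal sketch, assuming an nginx setup; the paths and domain here are hypothetical placeholders, not the client's real URLs:

```python
# Hypothetical 1:1 map from the acquired site's old URLs to their new
# locations; in practice this comes from a crawl export of both sites.
redirect_map = {
    "/define/word-one": "/dictionary/word-one",
    "/define/word-two": "/dictionary/word-two",
}

def to_nginx_rules(mapping, host="https://example.com"):
    """Emit one permanent (301) rewrite rule per old URL, in nginx syntax."""
    return [
        f"rewrite ^{old}$ {host}{new} permanent;"
        for old, new in sorted(mapping.items())
    ]

for rule in to_nginx_rules(redirect_map):
    print(rule)
```

The `permanent` flag makes nginx answer with a 301, which is the signal that passes the old pages' equity to the new locations.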
-
I would be thinking about these topics....
** How many other companies are purchasing, or have purchased, this data? Is it out there on lots of sites, and is that number growing?
** Since this is a low-ranking competitor, how much additional money would it take to simply buy the entire company (provided the data isn't already out there on a ton of other websites)?
** Rather than purchasing this content, what would be the cost of original authorship for just the words that produce the bulk of the traffic? Certainly on most reference sites, 10% of the content produces over 50% of the traffic.
** Knowing that in most duplicate content situations a significantly stronger site will crush the same content on the original publisher... where do I sit in this comparison of power?
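That 10%-drives-50% rule of thumb is worth verifying against real analytics before committing budget. A quick sketch; the visit counts below are invented for illustration, so swap in a real per-page export:

```python
def traffic_share(page_visits, top_fraction=0.10):
    """Return the share of total traffic contributed by the top
    `top_fraction` of pages, ranked by visits."""
    ranked = sorted(page_visits, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    total = sum(ranked)
    return sum(ranked[:top_n]) / total if total else 0.0

# Made-up visit counts for 10 dictionary entries (skewed, as is typical):
visits = [5000, 900, 400, 200, 150, 120, 100, 80, 30, 20]
print(round(traffic_share(visits), 2))  # prints 0.71
```

If the top slice really does carry most of the traffic, original authorship for just those entries may beat buying the whole data set.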