Concerns About Duplicate Content on a Purchased Site
-
Recently I purchased a site with 50+ DA (oldsite.com) that had been offline/404 for 9-12 months under the previous owner. The purchase included the domain and the content previously hosted on it. The backlink profile is 100% contextual and pristine.
Upon purchasing the domain, I did the following:
- Rehosted the old site and content that had been down for 9-12 months on oldsite.com
- Allowed a week or two for indexation on oldsite.com
- Hosted the old content on my newsite.com and then performed 100+ contextual 301 redirects from oldsite.com to newsite.com using direct and wildcard htaccess rules (a sketch of both rule types appears after this list)
- Issued a press release announcing newsite.com's acquisition of oldsite.com
- Performed a site "Change of Name" in Google from oldsite.com to newsite.com
- Performed a site "Site Move" in Bing/Yahoo from oldsite.com to newsite.com
It's been close to a month, and while organic traffic is growing gradually, it's not what I would expect from a domain with 700+ contextual referring domains. My current concern is that the original attribution of the content on oldsite.com may have shifted to scraper sites during the year or so that it was offline.
For example:
- Oldsite.com has full attribution prior to going offline
- Scraper sites scan the site and repost the content elsewhere (the effort is unsuccessful at the time because Google knows the original attribution)
- Oldsite.com goes offline
- Scraper sites continue hosting the content
- Google loses the consumer-facing cache of oldsite.com (and potentially loses the original attribution of the content)
- Google reassigns original attribution to a scraper site
- Oldsite.com is hosted again, and Google no longer remembers its original attribution and thinks the content is stolen
- Google then silently punishes oldsite.com and newsite.com (to which it redirects)
QUESTIONS
- Does this sequence have any merit? Does Google keep track of original attribution after the content ceases to exist in Google's search cache?
- Are there any tools or ways to tell if you're being punished for content being posted elsewhere on the web, even if you originally had attribution?
- Unrelated: Are there any other steps recommended for a site change as described above?
-
Hi, John.
OK, there is a Q&A video of Matt Cutts answering a question about the "originality" of content, specifically what happens when a bigger website copies content from a smaller author's site. (I can't find the link to it; maybe other Mozzers can help out here.) Matt said that yes, it's possible. So, as far as I understand, Google can reassign original attribution, especially if your website was offline for a long time.
At the same time, here is a Matt Cutts video about duplicate content as a penalizing factor: https://www.youtube.com/watch?v=mQZY7EmjbMA
According to that video, unless you're a very spammy scraper, you are going to be fine in terms of duplicate content.
About the slow gain in rankings: having lots of referring domains is no guarantee of fast or good rankings. It certainly helps a lot, but it's not the only factor. Have you optimized your content, technical SEO, etc.? As for tools for detecting penalties, check the Manual Actions section in Google Webmaster Tools; if there is nothing there, you haven't received a manual penalty from Google (algorithmic devaluations, however, won't show up there).
As for other recommendations: as I said, update/optimize content if needed and get your technical SEO in order. Since you said the rankings are growing and it has only been a month since you launched the website, you're doing pretty well. It always takes time, my friend.
Hope this helps.
-
Related Questions
-
Breaking up a site into multiple sites
Hi, I am working on a plan to divide a mid-DA website into multiple sites, so the current site's content will be split among these new sites. We can't share anything going forward because each site will be independent. The current homepage will change to just link out to the new sites and have minimal content. I am thinking the websites will take a hit in rankings, but I don't know how much or how long the drop will last. I know that if you redirect an entire domain to a new domain the impact is negligible, but in this case I'm only redirecting parts of a site to a new domain. Say we rank #1 for "blue widget" on the current site, and that page is going to be redirected to a new site on a new domain. How much of a drop can we expect? How hard will it be to rank for other new keywords, say "purple widget", that we don't rank for now? How much link juice can I expect to pass from the current website to the new websites? Thank you in advance.
Intermediate & Advanced SEO | timdavis
Country Code Top Level Domains & Duplicate Content
Hi, we're looking to launch in a new market. Currently we have a .com.au domain which is geo-targeted to Australia, and we want to launch in New Zealand, which ends with .co.nz. If I duplicate the Australian site completely on the new .co.nz domain name, would I face duplicate content issues from an SEO standpoint, even though it's on a completely separate country code? Or is it still advisable to set up hreflang tags across both of the domains? Cheers.
Intermediate & Advanced SEO | jayoliverwright
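A common answer to the scenario above is reciprocal hreflang annotations, so Google treats the two sites as regional alternates rather than duplicates. A minimal sketch with placeholder URLs, placed in the <head> of the corresponding page on both domains:

```html
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com.au/widgets/" />
```
-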
Search Causing Duplicate Content
I use Opencart and have found that a lot of my duplicate content (mainly from product pages) is caused by the Search function. Is there a simple way to tell Google to ignore the Search function pathway? Or is this particular action not recommended? Here are two examples: http://thespacecollective.com/index.php?route=product/search&tag=cloth http://thespacecollective.com/index.php?route=product/search
Intermediate & Advanced SEO | moon-boots
Spammy sites that link to a site
Hello, what is the best and quickest way to identify spammy sites that link to a website, and then remove them (Google disavow?) Thank you, dear Moz community - I appreciate your help 🙂 Sincerely, Vijay
Intermediate & Advanced SEO | vijayvasu
Problems with ecommerce filters causing duplicate content.
We have an ecommerce website with 700 pages. Due to the implementation of filters, we are seeing up to 11,000 pages being indexed, where the filter tag is appended to the URL. This is causing duplicate content issues across the site. We tried adding "nofollow" to all the filters, and we have also tried adding canonical tags, which it seems are being ignored. So how can we fix this? We are now toying with two other ideas: adding "noindex" to all filtered pages, or making the filters uncrawlable using JavaScript. Has anyone else encountered this issue? If so, what did you do to combat it, and was it successful?
Intermediate & Advanced SEO | Silkstream
Are all duplicate content issues bad? (Blog article Tags)
If so, how bad? We use tags on our blog and this causes duplicate content issues. We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise it at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how that affects our rankings. Before I do, can you give me some advice on this? Thanks,
Daniel.
Intermediate & Advanced SEO | Daniel_B
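One way to drop tag archives from the index without touching CMS templates is a server-level X-Robots-Tag header; a sketch for Apache 2.4+ with mod_headers, assuming the tag pages live under a /tag/ path (adjust to the CMS's actual URL structure):

```apache
<If "%{REQUEST_URI} =~ m#^/tag/#">
    # Tell search engines not to index tag archives, but still follow their links
    Header set X-Robots-Tag "noindex, follow"
</If>
```
-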
What is the best way to allow content to be used on other sites for syndication without taking the chance of duplicate content filters
Cookstr appears to be syndicating content to shape.com and mensfitness.com: a) they integrate their data into partner sites with an attribution back to their site, skinned with the partner's look; b) they link the image back to the image hosted on cookstr; c) the partner page does not have microformats or as much data as their own page does, so their own page is better for SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance! Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta Their syndicated content pages: http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
Intermediate & Advanced SEO | irvingw
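The usual safeguard for syndication like this is a cross-domain rel=canonical on each syndicated copy pointing back to the original, which tells Google which version should rank. A sketch using the cookstr example above:

```html
<!-- In the <head> of the syndicated copies on shape.com and mensfitness.com -->
<link rel="canonical" href="http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta" />
```
-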
Duplicate content question? thanks
Hi, in my time as an SEO I have never come across the following two scenarios. I am an advocate of unique content, and therefore always suggest, and in some cases demand, that all content is written or re-written. These are the scenarios I am facing right now: for example, we have www.abc.com (which has over 200 original recipes) and then we have www.xyz.com with the same recipes, but translated into another language as they target a different audience. Will Google penalize for duplicate content? The other issue is that the client took the recipes from www.abc.com (those that have been translated) and uses them on www.xyz.com as well. Both sites are owned by the same company, so it's not plagiarism and they have the legal rights, but I am not sure how Google will see it and whether it will penalize the sites. Thanks!
Intermediate & Advanced SEO | M_81