Concerns of Duplicative Content on Purchased Site
-
I recently purchased a 50+ DA site (oldsite.com) that had been offline/404 for 9-12 months under the previous owner. The purchase included the domain and the content previously hosted on it. The backlink profile is 100% contextual and pristine.
Upon purchasing the domain, I did the following:
- Rehosted the old site and content that had been down for 9-12 months on oldsite.com
- Allowed a week or two for indexation on oldsite.com
- Hosted the old content on my newsite.com and then performed 100+ contextual 301 redirects from oldsite.com to newsite.com using direct and wildcard .htaccess rules
- Issued a Press Release declaring the acquisition of oldsite.com for newsite.com
- Performed a site "Change of Name" in Google from oldsite.com to newsite.com
- Performed a site "Site Move" in Bing/Yahoo from oldsite.com to newsite.com
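For what it's worth, the direct and wildcard 301 rules from the redirect step can be sketched in .htaccess roughly like this (a minimal sketch assuming Apache with mod_rewrite enabled; the slugs are hypothetical placeholders, not your actual URLs):

```apache
RewriteEngine On

# Direct 301s for pages whose slug changed between the two sites
RewriteRule ^old-article-slug/?$ https://newsite.com/new-article-slug [R=301,L]

# Wildcard catch-all: anything not matched above maps path-for-path
RewriteRule ^(.*)$ https://newsite.com/$1 [R=301,L]
```

You can spot-check each rule with `curl -sI http://oldsite.com/old-article-slug` and confirm a single 301 hop in the `Location` header rather than a redirect chain.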
It's been close to a month and while organic traffic is growing gradually, it's not what I would expect from a domain with 700+ referring contextual domains. My current concern is around original attribution of content on oldsite.com shifting to scraper sites during the year or so that it was offline.
For Example:
- Oldsite.com has full attribution prior to going offline
- Scraper sites scrape the site and repost the content elsewhere (unsuccessfully at the time, because Google knows the original attribution)
- Oldsite.com goes offline
- Scraper sites continue hosting content
- Google drops the consumer-facing cache of oldsite.com (and potentially loses the original attribution of the content)
- Google reassigns original attribution to a scraper site
- Oldsite.com is hosted again, but Google no longer remembers its original attribution and thinks the content is stolen
- Google then silently punishes Oldsite.com and Newsite.com (which it redirects to)
QUESTIONS
- Does this sequence have any merit? Does Google keep track of original attribution after the content ceases to exist in Google's search cache?
- Are there any tools or ways to tell if you're being punished for content posted elsewhere on the web, even if you originally had attribution?
- Unrelated: are there any other steps recommended for a site change like the one described above?
-
Hi, John.
There is a Q&A video of Matt Cutts answering a question about the "originality" of content — specifically, what happens if a bigger website copies content from a smaller author's website. (I can't find the link; maybe other Mozzers can help out here.) Matt said that yes, it's possible. So, as far as I understand, Google can reassign original attribution, especially if your website was offline for a long time.
At the same time, here is a Matt Cutts' video about duplicate content as a penalizing factor - https://www.youtube.com/watch?v=mQZY7EmjbMA
According to that video, unless you're a very spammy scraper, you are going to be fine in terms of duplicate content.
About the slow gain in rankings: having lots of referring domains is no guarantee of fast or good rankings. It certainly helps a lot, but it's not the only factor. Have you optimized your content, technical SEO, etc.? As for tools for detecting penalties, check the Manual Actions section in Google Webmaster Tools; if nothing appears there, you haven't been manually penalized by Google.
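As a rough way to spot-check attribution yourself, you can pull a long, distinctive sentence from one of your pages and search it in quotes; if your URL isn't the top result, a scraper may be winning attribution for that passage. A small sketch of that probe (the sample text and helper names are mine, not from any particular tool):

```python
import re
from urllib.parse import quote_plus

def fingerprint_sentence(text: str, min_words: int = 8) -> str:
    """Pick a long, distinctive sentence from page text to use as an
    exact-match search probe."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    candidates = [s for s in sentences if len(s.split()) >= min_words]
    # The longest sentence is usually the most distinctive
    return max(candidates or sentences, key=len)

def exact_match_query_url(sentence: str) -> str:
    """Build a quoted Google query for the sentence; if your URL isn't
    the top result, attribution may have shifted to a scraper."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')

text = ("Welcome back. Our flagship guide covers advanced tactics for "
        "migrating a high-authority domain without losing link equity.")
print(exact_match_query_url(fingerprint_sentence(text)))
```

Repeating this for a handful of your most-linked pages gives a quick, manual read on whether attribution survived the downtime.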
About other recommendations: as I said, update and optimize content where needed, and get your technical SEO in order. Since rankings are growing and it has been only a month since you relaunched the site, you're doing pretty well. It always takes time, my friend.
Hope this helps.