Buying a disused website and using its content - penalty risk?
-
Hi all, I'm in the process of setting up a new website.
I have found various old websites covering a similar topic, and I'm interested in purchasing two of them for their content, which is very good even though the sites themselves have struggled to make ends meet.
One of these websites is still live, the other one hasn't been live for 2 years.
Let's say I bought these websites for their content, used that content on my new domain, and made sure the two websites the content came from were taken offline. Would I run the risk of being penalised? Does Google hold onto content from a website even after it goes offline?
-
Brilliant, thanks Gaston
-
Hello Bee,
In my opinion, the non-risky way to use the other site's content (as its owner) is to remove that site from Google's index first.
So, after buying those sites, you should apply a noindex robots meta tag and wait until no results appear when performing a site:website.com search. Even then, be sure that other sites aren't using that content when you deindex the first (old) site. I've read about some grey/black-hat techniques that scrape web.archive.org looking for taken-down sites' content.
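A minimal sketch of the noindex robots meta tag described above (the tag itself is standard; where it goes in your templates depends on your CMS):

```html
<!-- Add to the <head> of every page on the old site you want dropped from the index -->
<meta name="robots" content="noindex">
```

Once Google has recrawled the pages, a site:website.com search should gradually return fewer and fewer results.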
Hope it helps.
GR. -
Thanks for the reply Roman, I probably wasn't clear. I'll try to clarify:
Say there is a websiteA.com which was established for years but was then taken down and hasn't been live for a couple of years. If I bought websiteA.com and rolled lots of its content into a new website, let's call it websiteB.com, would I be risking any penalties on the new domain?
Thanks.
-
The answer is very simple: if you want to use another website's content and you own that website, you just need to use rel="canonical".
The process for dealing with duplicate content is to use the rel=canonical attribute. This tells search engines that a given page should be treated as though it were a copy of a specified URL, and that all of the links, content metrics, and "ranking power" that search engines apply to this page should actually be credited to the specified URL.
The rel=canonical attribute should be added to the HTML head of each duplicate version of a page, with the href value replaced by a link to the original (canonical) page. (Make sure you keep the quotation marks.) The attribute passes roughly the same amount of link equity (ranking power) as a 301 redirect and, because it's implemented at the page (rather than server) level, often takes less development time to implement.
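As a concrete sketch, the tag on a duplicate page would look something like this (the domain and path here use the hypothetical websiteB.com from the question; substitute your own canonical URL):

```html
<!-- In the <head> of each duplicate page -->
<link rel="canonical" href="https://websiteB.com/original-page/" />
```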
Regards
Related Questions
-
Duplicating content from manufacturer for client site and using canonical reference.
We manage content for many clients in the same industry, and many of them wish to keep their customers on their individualized websites (understandably). In order to do this, we have duplicated content in part from the manufacturers' pages for several "models" on the client's sites. We have put in a Canonical reference at the start of the content directing back to the manufacturer's page where we duplicated some of the content. We have only done a handful of pages while we figure out the canonical reference potential issue. So, my questions are: Is this necessary? Does this hurt, help or not do anything SEO-wise for our ranking of the site? Thanks!
Intermediate & Advanced SEO | moz1admin1 -
Do I use H1 tag for logo or page content?
Should the H1 tag be used for the main page content or the logo? I understand the original method was to put the H1 on the logo with the main search term; does this still hold true, or should it be content-focused?
Intermediate & Advanced SEO | seoman100 -
GWT does not play nice with 410 status code approach to expire content? Use 301s?
We have been diligently managing our index size in Google for our sites and are returning a 410 status code for pages that we no longer consider "up-to-date" but that still carry value for users to access, so that Google removes them from our index to keep it lean. However, we have been receiving GWT warnings across sites because of the 410 status codes Google is encountering, which makes us nervous that Google could interpret this approach as a lack of quality on our site. Does anyone have a view on whether the 410 approach is right for the given example, or should we consider simply using 301s or another status code to keep our GWT errors clean? Further notes: there is hardly ever any link juice being sent to those pages, so it is not like we are missing out on that, and the pages for which we return 410 are also marked as noindex and nofollow.
Intermediate & Advanced SEO | petersocapro0 -
Getting Your Website Listed
Do you have any suggestions? I do not know of local websites where I can get some easy backlinks. I guess a record in Google Places would be great as well. Any sound suggestion will be appreciated. Thanks!
Intermediate & Advanced SEO | stradiji0 -
Websites with same content
Hi, both my .co.uk and .ie websites have the exact same content, which consists of hundreds of pages; is this going to cause an issue? I have hreflang on both websites, plus Google Webmaster Tools is picking up that the websites are targeting different countries. Thanks
Intermediate & Advanced SEO | Paul780 -
Duplicate Content Issue
Why are URLs ending in .html or index.php annoying to search engines? I heard they can create some duplicate content, but I have no idea why. Could someone explain why that is? Thank you
Intermediate & Advanced SEO | Ideas-Money-Art0 -
Duplicate content
Is there manual intervention required for a site that has been flagged for duplicate content to get back to its original rankings, once the duplicated content has been removed? Background: Our site recently experienced a significant drop in traffic around the time that a chunk of content from other sites (ie. duplicate) went live. While it was not an exact replica of the pages on other sites, there was quite a bit of overlap. That content has since been removed, but our traffic hasn't improved. What else can we do to improve our ranking?
Intermediate & Advanced SEO | jamesti0 -
How would you optimise a news website?
I have been asked for advice on how to optimise a news website whose keywords, almost by definition, change every day according to the articles being written. How would you, for example, do SEO for the NYtimes.com? Great content and subsequent links I'm sure take care of themselves. Just onsite then? If so.... what?
Intermediate & Advanced SEO | seomasters0