Can I use content from an existing site that is not up anymore?
-
I want to take down a current website and create a new site or two (with a new URL, IP, and server). Can I use the content from the deleted site on the new sites, since I own it? How will Google see that?
-
Thank you. That is a great answer!
-
Hi there,
I would say that, taking William's point into account, canonicals might work to remove any possibility of Google seeing the new site as copying the old one. That said, I can't guarantee that Google could not note, either manually or automatically (manually would be much easier), that the two sites are owned by the same person and that the domain change is a measure taken to avoid a penalty. The truly safest thing to do is to rewrite the content and start afresh. The next safest is to remove the content from the old site, force a re-crawl or wait for Google to update its cache of the old site without the content, and then re-publish on the new site.
Canonicals will make this process quicker, but I don't believe it can be guaranteed that they won't result in Google making a stronger connection between the two sites, which might not go well. Again, this is only if there are enough similarities for Google to understand that this is not a scraper / scrapee situation but a situation where one entity owns both sites.
I'm sorry not to give a definitive answer.
-
After reading Jane and William's discussion, do you both agree that canonicals are the way to go? The new site will be similar (I'm trying to create a non-penalized site). The sites will have different IPs and servers but a lot of the same content, and none of the same backlinks... I just don't want to do the work if it's going to end up hurting me more. I don't see how I can get all those bad backlinks removed.
-
Really good point. Taking that into account, I might guess that an anti-manipulation method Google might employ is to grab registration details, hosting data, analytics codes, and other identifying factors to determine whether the canonicalised content is owned by the same person. That is, canonicals between tightly-linked sites where the "duplicate" is penalised could hurt the canonical source, stopping people from using this in place of the old 301 trick. If the scraper site has nothing in common with the source, Google does not pass on any negative metric from the duplicate.
This is just a theory too of course! I'd be confident assuming that they're taking precautions to stop this becoming a common trick. Awesome point!
-
The thought behind canonicals is this:
-
One of their uses is to fight against scrapers and such by still having the canonical tags in place when these spammy places grab your content.
-
If penalties passed through canonicals, then the penalties these scrapers carry would affect your site terribly. This is not the case, in my experience.
-
So, unless Google has already implemented the human tracking that was discussed a few Whiteboard Fridays ago, this should work. And even with hardcore human tracking for penalties, I think it's yet to be seen whether this would focus on small sites trying to fix penalties as opposed to the large black-hat spammers.
There is a bit of theorycrafting here, but in RoxBrock's specific situation, it looks like he has to pick the lesser of all evils.
-
The idea of using canonicals interests me, but I am not 100% sure it is risk-free. It used to be the case that you could 301 penalised websites and remove the penalty (we're talking 2010 and earlier here). Google is very keen on transferring penalties these days, so I would be surprised if they are leaving a loophole for canonical tags open like this, or if they will keep that loophole open for long.
You would ideally leave the site live and remove its content as William says - once you see that the cached version of the site no longer contains the content you want to move, you can feel free to take the old site down and put the content up on the new site.
We don't know what lengths Google goes to, or will go to, to prevent people from re-using previously penalised content (including good content from penalised websites), but the safest thing you can do whilst re-using this old content right now is to ensure the old content has been deindexed before putting it up again elsewhere.
The actual safest thing you can do is re-write the content, but I realise this might not be possible.
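For anyone wondering how to check that the content has actually dropped out, a quick way (assuming the pages were indexed in the first place; oldsite.com is a placeholder here) is to use Google's search operators directly in the search box:

    cache:oldsite.com/your-page — shows Google's stored copy of the page
    site:oldsite.com "a unique sentence from the content" — checks whether the content is still indexed

Once both stop returning the old content, it should be reasonably safe to republish it on the new site.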
-
Put the canonical tags in the old content, and point it to the new pages.
If you believe there are penalties, then 301ing is a little risky.
De-indexing content doesn't mean Google forgets it was there; they still have it cached, so this isn't ideal.
It looks like canonical may be your best bet.
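For reference, a minimal sketch of what that looks like (the URLs here are placeholders): the canonical tag is a single line in the head of each old page, pointing at the corresponding new page.

    <head>
      <!-- tells Google the new page is the preferred version of this content -->
      <link rel="canonical" href="http://www.newsite.com/new-page/" />
    </head>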
-
So you suggest leaving the old site up and adding the content to the new site, with the canonical tag pointing to the old site? Any other options you can think of?
-
You would need to keep the site live to speed up the de-indexation. Then block all bots through robots.txt and force a crawl.
Make sure this is what you want to do. There are other options for this situation, depending on your intent. Canonical tags, for example, would not transfer penalties and would still show Google where the authoritative source of the content is.
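For what it's worth, the robots.txt block mentioned above is just two lines in a file at the root of the old domain:

    User-agent: *
    Disallow: /

One caveat, offered tentatively: robots.txt stops crawling rather than indexing, so on its own it can leave URLs sitting in the index. Leaving the pages crawlable but adding a noindex robots meta tag to each one is often the more reliable way to get content dropped:

    <meta name="robots" content="noindex" />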
-
Many bad links were built on the old website by a questionable SEO firm, so I do believe the URL has been hit, but not with a formal penalty.
In order to redirect the old web pages I would need to keep the website live, which really does not serve my purpose: to use great content that was written in-house on a clean website with no backlinks (starting from scratch).
How would one go about "de-indexing" content?
Thank you for the prompt responses.
-
301 redirect the old web pages to the new ones using an .htaccess file on the old website. This will show Google that the content has moved to the new web pages. Check out the link for more information: http://moz.com/learn/seo/redirection
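As a rough sketch, assuming the old site runs on Apache (the domains and paths below are placeholders), the .htaccess rules might look like this:

    # permanently redirect a single old page to its new home
    Redirect 301 /old-page.html http://www.newsite.com/new-page/

    # or redirect every URL to the equivalent path on the new domain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]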
-
Interesting question!
I had to do some research on this; there is not much out there. One place I was sure to find an answer was the depths of the underworld in black-hat forums, and I found a whole discussion on it from six months back. (Not going to link to a black-hat site, sorry.)
However, what they said, and had tried and tested, was that the site must be de-indexed, and the same for all its pages, so that it does not trip the duplicate content filter.
However, let's back things up a little. Why are you doing this? Does the original have a penalty?
Why not keep the original live and put a canonical link on its pages pointing to the new site, stating that it is the original content owner? This way you will get traffic right away and not have to start ranking from scratch.
I need to know more about your reasons, please.