Can I use content from an existing site that is not up anymore?
-
I want to take down a current website and create a new site or two (with new url, ip, server). Can I use the content from the deleted site on the new sites since I own it? How will Google see that?
-
Thank you. That is a great answer!
-
Hi there,
I would say that, taking William's point into account, canonicals might work in order to remove any possibility that Google would see the new site as copying the old one. That said, I can't guarantee that they could not either manually or automatically (manually would be much easier) note that the two sites are owned by the same person and that the domain change is a measure taken to avoid a penalty. The truly safest thing to do is to re-write the content and start afresh. The next safest is to remove the content from the old site, force a re-crawl / wait for Google to update its cache of the old site excluding the content, and then re-publish on the new site.
Canonicals will make this process quicker, but I don't believe it can be guaranteed that they won't result in Google making a stronger connection between the two sites, which might not go well. Again, this is only if there are enough similarities for Google to understand that this is not a scraper / scrapee situation but a situation where one entity owns both sites.
I'm sorry not to give a definitive answer.
-
After reading Jane & William's discussion--do you both agree that canonicals are the way to go? The site will be similar (I'm trying to create a non-penalized site). The sites will have different IPs and servers but a lot of the same content, and none of the same backlinks... I just don't want to do the work if it's going to end up hurting me more. I don't see how I can get all those bad backlinks removed.
-
Really good point. Taking that into account, I might guess that an anti-manipulation method Google might employ is to grab registration details, hosting data, analytics codes, etc. and other identifying factors to determine whether the canonicalised content is owned by the same person. That is, canonicals between tightly-linked sites where the "duplicate" is penalised could hurt the canonical source, stopping people using this in place of the old 301 trick. If the scraper site has nothing in common with the source, Google does not pass on any negative metric from the duplicate.
This is just a theory too of course! I'd be confident assuming that they're taking precautions to stop this becoming a common trick. Awesome point!
-
The thought behind canonicals is this:
-
One of their uses is to fight against scrapers and such by still having the canonical tags in place when these spammy places grab your content.
-
If penalties passed through canonicals, then the penalties these scrapers have would affect your site terribly. This is not the case, in my experience.
-
So, unless Google has already implemented the human tracking that was discussed a few Whiteboard Fridays ago, this should work. And even with hardcore human tracking for penalties, I think it's yet to be seen whether this would focus on small sites trying to fix penalties as opposed to the large black hat spammers.
There is a bit of theorycrafting here, but in RoxBrock's specific situation, it looks like he has to pick the lesser of all evils.
-
-
The idea of using canonicals interests me, but I am not 100% sure it is risk-free. It used to be the case that you could 301 penalised websites and remove the penalty (we're talking 2010 and earlier here). Google is very keen on transferring penalties these days, so I would be surprised if they are leaving a loophole for canonical tags open like this, or if they will keep that loophole open for long.
You would ideally leave the site live and remove its content as William says - once you see that the cached version of the site no longer contains the content you want to move, you can feel free to take the old site down and put the content up on the new site.
We don't know what lengths Google goes to, or will go to, to prevent people from re-using previously penalised content (including good content from penalised websites), but the safest thing you can do whilst re-using this old content right now is to ensure the old content has been deindexed before putting it up again elsewhere.
The actual safest thing you can do is re-write the content, but I realise this might not be possible.
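If you do go the de-indexing route, one common mechanism (not spelled out in this thread, so treat it as a general illustration rather than the poster's prescription) is a meta robots noindex tag on each old page, left in place until the pages drop out of Google's index:

```html
<!-- Placed in the <head> of each page on the old site. -->
<!-- Asks search engines to drop the page from their index on the next crawl. -->
<meta name="robots" content="noindex">
```

Note that for the tag to be seen, the pages must remain crawlable - Google has to fetch a page to read the directive, so don't block it in robots.txt at the same time.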
-
Put the canonical tags in the old content, and point it to the new pages.
If you believe there are penalties, then 301ing is a little risky.
De-indexing content doesn't mean Google forgets it was there, they still have it cached, so this isn't ideal.
It looks like canonical may be your best bet.
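For reference, the canonical tag being discussed would look something like this on an old page (oldsite.com and newsite.com are placeholder domains, not from the thread):

```html
<!-- In the <head> of http://oldsite.com/some-article on the old site. -->
<!-- Tells search engines the preferred version of this content
     now lives at the corresponding page on the new site. -->
<link rel="canonical" href="http://newsite.com/some-article">
```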
-
So you suggest leaving the old site up and adding the content to the new site, with the canonical tag pointing to the old site? Any other options you can think of?
-
You would need to keep the site live to speed up the de-indexation. Then block all bots through robots.txt and force a crawl.
Make sure this is what you want to do. There are other options for this situation depending on your intent. Canonical tags, for example, would not transfer penalties and still show Google where the good source of the content is.
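Blocking all bots through robots.txt, as described above, is a two-line file at the site root (this is a sketch of the directive mentioned, not a recommendation):

```text
# robots.txt at the root of the old site
# Disallows all compliant crawlers from fetching any page.
User-agent: *
Disallow: /
```

Be aware that robots.txt blocks crawling rather than indexing, so already-indexed pages can linger in the index for some time after the block goes in.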
-
Many bad links were built on the old website by a questionable SEO firm, so I do believe the URL has been hit, but not with a formal penalty.
In order to redirect the old web pages I would need to keep the website live which really does not serve my purpose--which is to use great content that was written in-house on a clean website with no backlinks (starting from scratch).
How would one go about "de-indexing" content?
Thank you for the prompt responses.
-
301 redirect the old web pages to the new ones using an .htaccess file on the old website. This will show Google that the content has moved to the new web pages. Check out the link for more information: http://moz.com/learn/seo/redirection
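A minimal .htaccess sketch for this, assuming the old site runs Apache with mod_rewrite enabled and using oldsite.com / newsite.com as placeholder domains:

```apache
# .htaccess on the old site: permanently (301) redirect every URL
# to the same path on the new domain.
RewriteEngine On
RewriteRule ^(.*)$ http://newsite.com/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what signals to Google that the content has moved for good.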
-
Interesting question!
I had to do some research on this; there is not much out there. One place I was sure to find an answer was the depths of the underworld: black hat forums. I found a whole discussion on it from six months back. (Not going to link to a black hat site, sorry.)
What they had tried and tested was that the old site must be fully de-indexed, and the same for all of its pages, so that it does not trip the duplicate content filter.
However, let's back things up a little. Why are you doing this? Does the original have a penalty?
Why not keep the original live and put a canonical link on your pages pointing to the new site, stating that it is the original content owner? This way you will get traffic right away and not have to start ranking from scratch.
Need to know more about your reasons please.