Can I use content from an existing site that is not up anymore?
-
I want to take down a current website and create a new site or two (with new url, ip, server). Can I use the content from the deleted site on the new sites since I own it? How will Google see that?
-
Thank you. That is a great answer!
-
Hi there,
I would say that, taking William's point into account, canonicals might work in order to remove any possibility that Google would see the new site as copying the old one. That said, I can't guarantee that they could not either manually or automatically (manually would be much easier) note that the two sites are owned by the same person and that the domain change is a measure taken to avoid a penalty. The truly safest thing to do is to re-write the content and start afresh. The next safest is to remove the content from the old site, force a re-crawl / wait for Google to update its cache of the old site excluding the content, and then re-publish on the new site.
Canonicals will make this process quicker, but I don't believe it can be guaranteed that they won't result in Google making a stronger connection between the two sites, which might not go well. Again, this is only if there are enough similarities for Google to understand that this is not a scraper / scrapee situation but a situation where one entity owns both sites.
I'm sorry not to give a definitive answer.
-
After reading Jane and William's discussion--do you both agree that canonicals are the way to go? The new site will be similar (I'm trying to create a non-penalized site). The sites will have different IPs and servers but a lot of the same content, and none of the same backlinks... I just don't want to do the work if it's going to end up hurting me more. I don't see how I can get all those bad backlinks removed.
-
Really good point. Taking that into account, I might guess that an anti-manipulation method Google might employ is to grab registration details, hosting data, analytics codes, etc. and other identifying factors to determine whether the canonicalised content is owned by the same person. That is, canonicals between tightly-linked sites where the "duplicate" is penalised could hurt the canonical source, stopping people using this in place of the old 301 trick. If the scraper site has nothing in common with the source, Google does not pass on any negative metric from the duplicate.
This is just a theory too of course! I'd be confident assuming that they're taking precautions to stop this becoming a common trick. Awesome point!
-
The thought behind canonicals is this:
-
One of their uses is to fight scrapers and the like: your canonical tags are still in place when these spammy sites grab your content.
-
If penalties passed through canonicals, then the penalties these scrapers carry would affect your site terribly. In my experience, this is not the case.
-
So, unless Google has already implemented the human tracking that was discussed a few Whiteboard Fridays ago, this should work. And even with hardcore human tracking for penalties, I think it's yet to be seen whether this would focus on small sites trying to fix penalties as opposed to the large black-hat spammers.
There is a bit of theorycrafting here, but in RoxBrock's specific situation, it looks like he has to pick the lesser of all evils.
-
-
The idea of using canonicals interests me, but I am not 100% sure it is risk-free. It used to be the case that you could 301 penalised websites and remove the penalty (we're talking 2010 and earlier here). Google is very keen on transferring penalties these days, so I would be surprised if they are leaving a loophole for canonical tags open like this, or if they will keep that loophole open for long.
You would ideally leave the site live and remove its content as William says - once you see that the cached version of the site no longer contains the content you want to move, you can feel free to take the old site down and put the content up on the new site.
We don't know what lengths Google goes to, or will go to, to prevent people from re-using previously penalised content (including good content from penalised websites), but the safest thing you can do whilst re-using this old content right now is to ensure the old content has been de-indexed before putting it up again elsewhere.
The actual safest thing you can do is re-write the content, but I realise this might not be possible.
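For reference, the de-indexation step before the move can be requested with a robots meta tag on each old page. A minimal sketch; note that Google must still be able to crawl the page to see the tag, so the page should not be blocked in robots.txt until after it has dropped out of the index:

```html
<!-- In the <head> of each OLD page you want dropped from the index. -->
<!-- Google must still be able to crawl this page to see the tag, -->
<!-- so don't block it in robots.txt until after it is de-indexed. -->
<meta name="robots" content="noindex">
```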
-
Put the canonical tags in the old content, and point it to the new pages.
If you believe there are penalties, then 301ing is a little risky.
De-indexing content doesn't mean Google forgets it was there; it still has the content cached, so this isn't ideal.
It looks like canonical may be your best bet.
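As a sketch of the canonical setup described above, the tag would sit in each old page's head and point at its replacement on the new site (example-new.com is a placeholder domain, not from the thread):

```html
<!-- In the <head> of the OLD page, pointing at its replacement -->
<!-- on the new site. example-new.com is a placeholder domain. -->
<link rel="canonical" href="https://example-new.com/great-article/">
```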
-
So you suggest leaving the old site up and adding the content to the new site, with a canonical tag pointing to the old site? Any other options you can think of?
-
You would need to keep the site live to speed up the de-indexation. Then block all bots through robots.txt and force a crawl.
Make sure this is what you want to do. There are other options for this situation depending on your intent. Canonical tags, for example, would not transfer penalties and still show Google where the good source of the content is.
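A minimal robots.txt that blocks all compliant crawlers would look like the sketch below. One caveat worth hedging on: a robots.txt block stops crawling, not indexing, so if you are relying on noindex tags to de-index the pages, this block should only go in after the pages have already dropped out of the index:

```text
# robots.txt at the root of the OLD site.
# Blocks all well-behaved crawlers from every URL.
User-agent: *
Disallow: /
```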
-
Many bad links were built on the old website by a questionable SEO firm, so I do believe the URL has been hit, but not with a formal penalty.
In order to redirect the old web pages I would need to keep the website live which really does not serve my purpose--which is to use great content that was written in-house on a clean website with no backlinks (starting from scratch).
How would one go about "de-indexing" content?
Thank you for the prompt responses.
-
301 redirect the old web pages to the new ones using an .htaccess file on the old website. This will show Google that the content has moved to the new web pages. Check out the link for more information: http://moz.com/learn/seo/redirection
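A hedged sketch of that .htaccess (assumes Apache; the domains and paths are placeholders, not from the thread):

```apache
# .htaccess at the root of the OLD site.
# Per-page 301s (mod_alias), old path -> new URL:
Redirect 301 /old-page.html https://example-new.com/new-page.html

# Or redirect every path to its equivalent on the new domain
# (requires mod_rewrite):
RewriteEngine On
RewriteRule ^(.*)$ https://example-new.com/$1 [R=301,L]
```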
-
Interesting question!
I had to do some research on this; there is not much out there. One place I was sure to find an answer was the depths of the underworld: black-hat forums. I found a whole discussion on it from six months back. (Not going to link to a black-hat site, sorry.)
What they had tried and tested was that the old site must be de-indexed, and the same for all of its pages, so that the new site did not trip duplicate-content filters.
However, let's back things up a little. Why are you doing this? Does the original have a penalty?
Why not keep the original live and put a canonical link in your pages pointing to the new site, stating that it is the original content owner? This way you will get traffic right away and not have to start ranking from scratch.
Need to know more about your reasons please.