Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Is it a good idea to remove old blogs?
-
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is that we have A LOT of old blog posts that were not well written and honestly are not overly relevant. None of them rank for anything, and they could be causing a lot of duplicate content issues. Our newer posts, written in more of a Q&A format, are performing better.
So my thought is to basically wipe out all the blog posts from 2010-2012 -- probably 450+ posts.
What do you guys think?
-
You may find this case study helpful, from a blog that decided to do exactly that:
http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
-
It depends on what you mean by "remove."
If the content of all those old posts truly is poor, I'd strongly consider going through them one by one and seeing how you can re-write, expand upon, and improve each post. Can you tackle the subject from another angle? Are there images, videos, or other visual assets you can add to the post to make it more intriguing and shareable?
Then, you can seek out some credible places to strategically place your blog content for additional exposure and maybe even a link. Be careful here, however. I'm not talking about forum and comment spam, but there may be some active communities that are open to unique and valuable content. Do your research first.
When going through each post one by one, you'll undoubtedly find posts that are simply "too far gone" or not relevant enough to keep. Essentially, it wouldn't even be worth your time to re-write them. In this case, find the page on your website that's MOST SIMILAR to the blog post. The match may be topical, but it could also be an author page, another blog post that is valuable, a contact page, etc. Then 301 redirect the weak posts to those pages.
Not only are you salvaging any little value those blog posts may have had, but you're also preventing crawl and index issues by telling the search engine bots where that content is now (assuming it was indexed in the first place).
This is an incredibly long content process and could take you months, especially if there's a lot of content that's good enough to be re-written, expanded upon, and added to. However, making that content relevant and useful is the best thing you can do. It's a long process, but if your best content writers need a project, this would be it.
To recap: 1) Go through each blog post one by one and determine what's good enough to edit and what's "too far gone." 2) Re-write, edit, add to the keepers (content plus images/videos) and re-promote them socially and to appropriate audiences and communities. 3) For the posts that were "too far gone," 301 redirect them to the most relevant posts and pages that remain live.
Again, I can say firsthand that this is a LONG process. I've done it for a client in the past. However, the return was well worth the work. And by doing it this way and not just deleting posts, you're preventing yourself a lot of crawl/index headaches with the search engines.
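As a sketch of that third step, the old-post-to-live-page mapping can be kept in one place and turned into Apache `Redirect 301` directives for an `.htaccess` file. The URLs below are hypothetical examples, and the output format assumes Apache's `mod_alias`; adjust for your own server:

```python
# Map each "too far gone" post to the most similar live page.
# All paths here are made-up examples, not real URLs.
redirects = {
    "/blog/2010/old-widget-rant": "/blog/widget-buying-guide",
    "/blog/2011/press-release-copy": "/about",
    "/blog/2012/thin-listicle": "/blog/faq",
}

def to_htaccess(mapping):
    """Emit one Apache 'Redirect 301' line per retired post."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(to_htaccess(redirects))
```

Keeping the mapping in data like this also makes it easy to regenerate the rules if you later change which live page a dead post should point at.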
-
we have A LOT of old blogs that were not well written and honestly are not overly relevant.
Wow.... it is great to hear someone looking at their content and deciding they can kick it up a notch. I have seen a lot of people who would never, ever, pull the kill switch on an old blog post. In fact, they are still out there hiring people to write stuff that is really crappy.
If this were my site, I would first check to be sure that I don't have a Penguin or unnatural-links problem. If you think you are OK there, here is what I would do.
-
I would look at those blog posts to see if any of them have traffic, link, or revenue value. Value is defined as... A) traffic from any search engine or other quality source, B) valuable links, C) viewing by current website visitors, D) visitors who enter through those pages and generate income through ads or purchases.
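That value test is easy to run in bulk against an export of your analytics and link data. A minimal sketch, assuming a made-up CSV layout; the column names and thresholds here are illustrative, not any tool's standard export:

```python
import csv
import io

# Hypothetical export: one row per post with traffic, links, and revenue.
SAMPLE = """url,search_sessions,linking_domains,revenue
/blog/2010/post-a,0,0,0.00
/blog/2011/post-b,42,3,12.50
/blog/2012/post-c,0,1,0.00
"""

def has_value(row, min_sessions=10, min_links=1, min_revenue=0.01):
    """A post passes if it draws traffic, earns links, or makes money."""
    return (
        int(row["search_sessions"]) >= min_sessions
        or int(row["linking_domains"]) >= min_links
        or float(row["revenue"]) >= min_revenue
    )

# Posts worth improving rather than cutting.
keep = [r["url"] for r in csv.DictReader(io.StringIO(SAMPLE)) if has_value(r)]
print(keep)
```

With 450+ posts, a pass like this narrows the list so the one-by-one editorial review only covers pages that cleared at least one value bar.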
-
If any of them pass the value test above then I would improve that page. I would put a nice amount of work into that page.
-
Next I would look at each of those blog posts and see if any have content value. That means an idea that could be developed into valuable content... or valuable content that could be simply rewritten to a higher standard. Valuable content is defined as a topic that might pull traffic from search or be consumed by current site visitors.
-
If any pass the valuable content test then I would improve them. I would make them kickass.
-
After you have done the above, I would pull the plug on everything else.... or if I was feeling charitable I would offer them to a competitor.

Salutes to you for having the courage to clean some slates.
-
-
I would run them through Copyscape to check for plagiarism/duplicate content issues. After that, I would check for referral traffic. If some pages draw enough traffic, you might not want to remove them. Finally, round it off with a page-level link audit; Majestic can give you a pretty good idea of where they stand.
The pages that don't make the cut should be set to return a 410 status code. If you still don't like the content on pages with good links and/or referral traffic, 301 those to better content on the same subject.