Panda Cleanup - Removing Old Blog Posts, Let Them 404 or 301 to Main Blog Page?
-
tl;dr... When removing old blog posts that may be affected by Panda, should we let them 404 or 301 them to the main blog page?
We have been managing a corporate blog since 2011. The old content is OK, but we've recently hired a new blogger who is doing an outstanding job, creating content that is genuinely useful to site visitors and on a higher level than what we've had previously. The old posts mostly have no comments and don't get much user engagement. I know Google recommends creating great new content rather than removing old content over Panda concerns, but I'm confident we're doing the former, and I still want to purge the old stuff that isn't doing anyone any good.
So let's pretend we're being dinged by Panda for having a large amount of content that doesn't get much user engagement (I'm not sure that's actually the case; rankings remain good, though we have been overtaken on a couple of key rankings recently). I've gone through Analytics and noted every blog post that has generated at least 1 lead or had at least 20 unique visits all time. I think that's a pretty low bar, and everything else can safely be removed.
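(For what it's worth, here is roughly how I'm pulling that keep/cull split from an exported Analytics report. It's just a rough sketch: the file name and column names are placeholders for whatever the actual export contains, and the thresholds match the ones above.)

```python
import csv

# Rough sketch: split an exported all-time report into posts to keep vs. cull.
# "page", "unique_visits" and "leads" are placeholder column names; rename them
# to match the real export before running this.
KEEP_MIN_LEADS = 1
KEEP_MIN_UNIQUE_VISITS = 20

keep, cull = [], []
with open("blog_posts_all_time.csv", newline="") as f:
    for row in csv.DictReader(f):
        leads = int(row["leads"] or 0)
        visits = int(row["unique_visits"] or 0)
        if leads >= KEEP_MIN_LEADS or visits >= KEEP_MIN_UNIQUE_VISITS:
            keep.append(row["page"])
        else:
            cull.append(row["page"])

print(f"{len(keep)} posts to keep, {len(cull)} removal candidates")
with open("cull_candidates.txt", "w") as out:
    out.write("\n".join(cull) + "\n")
```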
So for the remaining posts (I'm guessing there are hundreds of them, though I haven't compiled the specific list yet), should we just let them 404, or do we 301 redirect them to the main blog page? The underlying question: if our primary purpose is cleaning things up for Panda specifically, does a 301 make sense, or would Google see those "low quality" pages being redirected to a new place and pass some of that "low quality" signal on to the destination page? Is it better for that content to simply go away completely (404)?
-
Thanks, this is very helpful. I love the idea of having the new blogger write posts about the same topics, thereby getting much more engaging content at the URLs that already have traffic coming in.
-
If this were my site, I would look at analytics to see which of the old posts are still bringing in traffic, then ask the current blogger whether he/she sees topics among them that would be useful and that he/she is excited to write about, then improve those pages without changing their URLs.
Once that is done, I would 301 redirect the pages that will be deleted to the homepage of the blog.
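If you end up with hundreds of URLs to redirect, it's easy to script the rules rather than write them by hand. Here's a minimal sketch, assuming an Apache/.htaccess setup, a plain-text list of the old post paths (one per line), and /blog/ as the blog homepage; all of those are placeholders to adjust for the actual site:

```python
# Minimal sketch: turn a plain-text list of old post paths into Apache 301 rules.
# cull_candidates.txt (one URL path per line) and the /blog/ destination are
# assumptions; swap in the real list and the real blog homepage path.
BLOG_HOME = "/blog/"

with open("cull_candidates.txt") as f:
    old_paths = [line.strip() for line in f if line.strip()]

rules = [f"Redirect 301 {path} {BLOG_HOME}" for path in old_paths]

with open("panda_cleanup_redirects.conf", "w") as out:
    out.write("\n".join(rules) + "\n")

print(f"Wrote {len(rules)} redirect rules")
```

Paste the generated rules into .htaccess (or include the file from the server config), then spot-check a few of the old URLs to confirm they return a 301 to the blog homepage rather than a 404.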