What Should I Do With Low Quality Content?
-
My site has definitely been hit by Panda, so I am in the process of cleaning it of low-quality content.
Needless to say, the truly shitty articles are being completely removed, but I think a lot of this content is now low quality simply because it has become obsolete and dated.
So what should I do with this content?
Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and do a 301 redirect to the new post?
Or should I rewrite the content of these articles in place so I can keep the old URLs and backlinks?
One thing to note is that I've got a lot more followers than I used to, so publishing a new post gets many more views, likes, and shares on social networks.
-
I wouldn't rewrite old posts. If they can be refreshed or added to with recent updates, go ahead and redirect (if that can be done without losing any additional info) or link to the new version.
Things get tricky if there's nothing new to write about the topic. First, kill the really bad stuff, as Mike suggested, and keep the good stuff. Borderline content probably isn't worth keeping unless it is still receiving traffic. In my experience with Panda, serving a 410 on bad pages works better than redirecting, but if a page has good links you will probably want to 301 it to the next-best page.
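For what it's worth, that decision tree can be sketched in a few lines. This is just my rule of thumb made concrete; the function and field names are hypothetical, not anything from Google or Moz:

```python
# A rough sketch of the triage logic above. Thresholds and names
# are made up for illustration only.

def triage_page(organic_visits, good_backlinks, salvageable):
    """Decide what to do with an old page after Panda.

    organic_visits: monthly organic sessions the page still gets
    good_backlinks: count of quality inbound links pointing at it
    salvageable:    True if the content can be refreshed/improved
    """
    if organic_visits > 0 and salvageable:
        return "improve in place"       # keep the URL, refresh the content
    if good_backlinks > 0:
        return "301 to next-best page"  # preserve the link equity
    return "410 gone"                   # signal permanent removal

# A dead page with no links or traffic:
print(triage_page(organic_visits=0, good_backlinks=0, salvageable=False))
# prints: 410 gone
```

The point is just that the 410 is the default for junk; the 301 only earns its keep when there are real links to preserve.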
If a page is still receiving organic traffic, think about what you can do to make it better or provide additional resources and reading. Try to save traffic-generating pieces by improving them and making them useful to the people landing on them. For high-traffic pieces, look at the organic keywords and make sure the page actually answers those queries.
As always with Panda, make sure your design doesn't turn people off and that you're not filling the template with too many ads.
-
No, 404s are fine. They just won't pass link juice, and Google will eventually stop crawling them; ideally they will phase out on their own as the web moves along.
-
And what about the really crappy content? If I delete it, I end up with lots of 404 errors. Is that a problem with Google?
-
In that case, would you change the date to publish it as "new" content? Because even if I rewrite it, I can't really post an article from 2008 to the website's Facebook page.
-
I think it all depends on how bad the content is. If you have content that is complete and total crap (10+ instances of the same keyword, reads like a toddler wrote it, etc.), it is better to just kill it and redirect those pages elsewhere. On the other hand, if the content is salvageable, take the time to rewrite it and make it good. The benefit is that at the end of the day you have good content instead of a bunch of links redirected to pages that don't necessarily have anything to do with the old content.
Good luck!
P.S. Don't forget the disavow tool if you need it!
-
There is absolutely nothing wrong with multiple 301s pointing to the same page.
-
What if I have two dated articles that I merge into one updated article? Does it matter if both 301 redirects point to the same URL?
-
With a 301 redirect? That's going to be a huge .htaccess file.
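One way to keep a big redirect file manageable is to generate the rules from a simple old-to-new mapping instead of editing .htaccess by hand. A minimal sketch (the paths here are hypothetical), which also shows that several old URLs can happily point at one new one:

```python
# Generate Apache mod_alias "Redirect 301" lines from an old->new map.
# The URL paths are invented for illustration.

redirects = {
    "/2008/old-panda-post": "/2013/updated-panda-post",
    "/2009/another-stale-post": "/2013/updated-panda-post",  # two old URLs, one target
}

# One "Redirect 301 <old> <new>" line per mapping entry.
lines = [f"Redirect 301 {old} {new}" for old, new in redirects.items()]
htaccess_block = "\n".join(lines)
print(htaccess_block)
```

Paste the printed block into .htaccess (or, better, the server config) and regenerate it whenever the mapping changes.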
-
Kill it and redirect if there are any backlinks incoming. Definitely.