What Should I Do With Low Quality Content?
-
As my site was definitely hit by Panda, I am in the process of cleaning low-quality content off my website.
Needless to say, the truly bad articles are being removed completely, but I think a lot of this content is now low quality simply because it is obsolete and dated.
So what should I do with this content?
Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and do a 301 redirect to the new post?
Or should I rewrite the content of these articles in place so I can keep the old URL and backlinks?
One thing to note is that I've got a lot more followers than I used to, so publishing a new post gets a lot more views, likes, and shares from social networks.
-
I wouldn't rewrite old posts. If they can be refreshed or added to with recent updates, go ahead and redirect (if it can be redirected without losing any additional info) or link to the new version.
Things get tricky if there's nothing new that can be written about the post. First, kill the really bad stuff, as Mike suggested, and keep the good stuff. The stuff on the borderline is probably not worth keeping unless it was still receiving traffic. In my experience with Panda, using 410s on bad pages is better than redirecting, but you will probably want to 301 redirect to the next-best page if you have good links.
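To make the distinction concrete, here's a minimal .htaccess sketch of that approach, assuming Apache with mod_alias enabled; the URL paths are hypothetical placeholders, not anyone's real site:

```apache
# 410 Gone for thin pages with no inbound links or traffic --
# tells Google the page was removed intentionally
Redirect gone /old-thin-article/
Redirect gone /keyword-stuffed-post/

# 301 the pages that still have good backlinks to the next-best page
Redirect 301 /dated-panda-guide/ /updated-panda-guide/
```

The difference matters because a 410 signals intentional removal, while a 301 passes the old page's link equity to its replacement.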
If it was still receiving organic traffic, think about what you can do to make it better or provide additional resources and reading. Try to save traffic-generating pieces by improving them and making them useful to the people who were landing on them. For high-traffic pieces, you will want to look at the organic keywords and make sure the page somehow answers the query.
As always with Panda, make sure your design doesn't turn people off and that you're not filling the template with too many ads.
-
No, 404s are fine. They just won't pass link juice, and Google will eventually stop crawling them; ideally they will phase out regardless as the internet moves along.
-
And what about real crappy content? If I delete it, I end up with lots of 404 errors. Does that pose a problem to Google?
-
In this case, would you change the date to post it as "new" content? Because even if I rewrite it, I can't post an article from 2008 to the website's Facebook page.
-
I think that it all depends on how bad the content is. If you have content that is complete and total crap (10+ instances of the same keyword, reads like a toddler wrote it, etc.), it is better just to kill it and redirect your pages elsewhere. On the other hand, if the content is salvageable, then take the time to rewrite it and make it good. The benefit is that at the end of the day you have good content instead of a bunch of links redirected to pages that don't necessarily have anything to do with the old content.
Good luck!
P.S. Don't forget the disavow tool if you need it!
-
There is absolutely nothing wrong with multiple 301s pointing to the same page.
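For example, merging two dated articles into one updated post could look like this in .htaccess — a sketch with hypothetical URL paths, assuming Apache with mod_alias:

```apache
# Both old articles permanently redirect to the single merged post;
# multiple sources pointing at one target is perfectly valid
Redirect 301 /seo-tips-2008/ /seo-tips-updated/
Redirect 301 /seo-tips-2009/ /seo-tips-updated/
```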
-
What if I have two dated articles that I merge into one updated article. Does it matter if I have two 301 redirects to the same URL?
-
With a 301 redirect? That's going to be a huge .htaccess file.
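It doesn't have to be one line per URL. If the old posts share a common URL pattern, a single RedirectMatch rule (mod_alias regex matching) can replace hundreds of individual entries — a sketch with hypothetical paths:

```apache
# Redirect everything under the old 2008 archive to one updated resource
RedirectMatch 301 ^/archive/2008/.* /updated-guide/
```

Whether a catch-all like this is appropriate depends on whether the old pages genuinely map to one replacement; otherwise, per-URL rules are safer.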
-
Kill it and redirect if there are any backlinks incoming. Definitely.