Removing blog posts with little/thin content
-
A large share of our blog posts (I'd say 75%) is what I'd consider low quality: short content (500 words or less) with few shares, no backlinks, and no comments. Most of these get 0-2 unique views a day (though combined this adds up).
Will removing these pages provide an SEO benefit that outweighs the traffic we'd lose by taking them down?
I've heard the likes of Neil Patel and Brian Dean suggest so, but I'm worried it will do the opposite: with less content indexed, traffic will actually fall.
Sam
-
Sam,
If you can safely assume that the pages are not hurting you, let them stay. It's certainly not ideal to have a website loaded with thin content. But, as is the case with most small sites, the posts are likely to do you more good than harm, provided you're willing to show them some attention.
Here's a good strategy to deploy:
-
Find the top 10 posts, judged by analyzing your Google Analytics data against the topics you hope to rank for, then beef them up with additional text and graphics (see the sketch after this list for one way to pull that shortlist).
-
Republish the posts, listing them as "updated."
-
Share the posts via social, using a meaningful quote from each piece to draw interest and invite re-shares.
-
Continue sharing the posts in the following weeks, each time with new text.
-
Gauge the performance of each social share, then use this information both to create additional headlines for new posts and to learn which content draws the most interest.
-
Repeat the process with the next 10 posts.
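For the "find the top 10 posts" step, here's a minimal sketch of how that shortlist could be pulled from a Google Analytics "All Pages" CSV export. The file name, the column names, and the /blog/ URL pattern are assumptions; adjust them to match your own export and site structure.

```python
# Sketch: shortlist the top 10 posts to beef up, from a GA "All Pages"
# CSV export. File name, column names, and URL patterns are assumptions.
import pandas as pd

# Topics you hope to rank for (hypothetical examples).
topics = ["keyword research", "link building", "local seo"]

df = pd.read_csv("ga_all_pages_last_12_months.csv")

# Keep only blog posts (assumes blog URLs live under /blog/).
posts = df[df["page"].str.startswith("/blog/")].copy()

# Optionally narrow to posts whose URL slug hints at a target topic.
pattern = "|".join(t.replace(" ", "-") for t in topics)
on_topic = posts[posts["page"].str.contains(pattern, case=False, regex=True)]

# Rank by unique pageviews and take the 10 strongest candidates.
top10 = on_topic.sort_values("unique_pageviews", ascending=False).head(10)
print(top10[["page", "unique_pageviews"]])
```

The same ranking, re-run after each republish-and-share cycle, also tells you whether the updates are actually moving the needle.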
When you have thin, poorly performing content on your site, you can't learn enough about what you're doing right to make a sound call. So creating more content, even "better" content, is likely a mistake. The wiser approach is to use the content you already have to investigate additional content ideas that would better serve your audience. Through social media and additional traffic to your site, you should be able to better discern which pieces of content will provide the greatest benefit in the future.
The old content is likely to perform much better as well.
RS
-
-
It's difficult to talk in terms of true value. Some of them may provide some value, but they pale in comparison to the new blog posts we have lined up, and in my opinion they bring the blog down; personally I wouldn't be sad to see them go.
I think it's time to exterminate.
Sam
-
Do the contents of these blog posts provide any value at all to the reader? Are they written well, and would you actually be sad to see them go? If yes, then refer to my previous response on re-purposing them to create even better content with more SEO value.
If not, and you're just worried about SEO, then based on those stats I'd say be rid of them.
-
Thanks all. From my analysis of the last twelve months:
376 pages (although I'd estimate 70 of these aren't pages)
104 pages have a bounce rate of 100%
307 pages have fewer than 20 unique views (over the previous 12 months), but combined those pages account for 1,374 unique views, which is a sizable sum.
So the question is: is it worth pulling all the pages below 20 unique views, plus all the 100% bounce rate pages, from the site? Will it actually benefit our SEO, or am I just making work for myself? (A rough sketch of how to pull that candidate list is below.)
I'd love to hear from people who've actually seen positive SEO movements after removing thin pages.
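For anyone wanting to reproduce this kind of audit, a rough sketch against the same sort of GA "All Pages" CSV export; the file and column names are assumptions, and bounce rate is assumed to be expressed as a percentage.

```python
# Sketch: flag thin-page candidates (under 20 unique views or 100% bounce)
# and check how much combined traffic removing them would give up.
# File and column names are assumptions; adjust to your export.
import pandas as pd

df = pd.read_csv("ga_all_pages_last_12_months.csv")

low_views = df["unique_pageviews"] < 20
full_bounce = df["bounce_rate"] >= 100  # bounce rate as a percentage

candidates = df[low_views | full_bounce]
print(f"{len(candidates)} candidate pages for removal or consolidation")
print(f"Combined unique views at stake: {candidates['unique_pageviews'].sum():,}")
```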
-
It's a waste of good content to remove it just because it's considered "thin." In your position, I would group these under-performing/thin blog posts into topical themes, then compile and update them to create "epic content" in the form of detailed guides, or whatever format best suits the material. Add to the long post so there's a logical structure to the combination (so it doesn't just read as if you stuck multiple posts together), then 301 redirect the old post URLs to the newly created relevant posts; a rough sketch of that redirect step is below. Not only do you end up with fresh content that could provide a ton of value to your readers, but the SEO value of these so-called "epic" posts should, in theory, be more impactful.
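To illustrate the redirect step, here's a minimal sketch that maps retired thin-post URLs to the consolidated guides that absorb them and writes out 301 rules. The URLs are hypothetical, and the Apache mod_alias `Redirect 301` output is just one option; use whatever your server or CMS expects.

```python
# Sketch: map retired thin-post URLs to the consolidated "epic" guide that
# absorbed each one, then emit 301 redirect rules. All URLs are hypothetical,
# and Apache's "Redirect 301" directive is just one possible output format.
redirect_map = {
    "/blog/what-is-a-meta-description": "/guides/on-page-seo",
    "/blog/title-tag-length-tips":      "/guides/on-page-seo",
    "/blog/quick-tip-image-alt-text":   "/guides/on-page-seo",
    "/blog/5-easy-link-building-ideas": "/guides/link-building",
}

with open("redirects.conf", "w") as conf:
    for old_path, new_path in redirect_map.items():
        conf.write(f"Redirect 301 {old_path} {new_path}\n")
```

The same mapping also tells you which internal links to update, so you aren't leaning on the redirects forever.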
Good luck, whatever you decide to do!
-
My rule of thumb would be:
Take offline all pages that have under 30 organic sessions per month.
As Dmitrii already mentioned, check your past data for these posts and look at average session duration, bounce rate, and pages per session, which lets you validate the quality of the traffic. If there are posts with decent stats, don't take them offline; rather, update them or write a new blog post on the topic and set up a redirect. In that case, also have a look in Google Webmaster Tools at the actual search queries (maybe you'll find some useful new insights). A sketch of this rule of thumb as a filter is below.
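A hedged sketch of that rule of thumb, assuming you've exported organic sessions and engagement metrics per page to a CSV; the file name, column names, and engagement thresholds are assumptions to tune against your own data.

```python
# Sketch: apply the "under 30 organic sessions/month" rule of thumb, but
# let decent engagement stats earn a page an update instead of removal.
# File name, column names, and thresholds are assumptions.
import pandas as pd

df = pd.read_csv("organic_sessions_by_page_monthly.csv")

low_traffic = df[df["organic_sessions_per_month"] < 30].copy()

engaged = (low_traffic["avg_session_duration_sec"] > 60) & (low_traffic["bounce_rate"] < 80)

update_these = low_traffic[engaged]    # decent stats: refresh or rewrite, keep the URL
retire_these = low_traffic[~engaged]   # weak stats: take offline or 301 to a stronger page

print(f"{len(update_these)} posts to update, {len(retire_these)} posts to retire or redirect")
```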
-
Hi there.
Are those blogs ranking at all for any related keyphrases? At the same time, how about bounce rate and time on page for those 2 visits a day? Are you sure those visits are not bots/crawlers?
We did a similar reduction about 6 months ago and we haven't seen any drop in rankings. The share of traffic to thin pages was pretty small, bounce rate was high, and time on page was very short. So why keep anything that doesn't do any good?
Related Questions
-
Website Essentially Delisted and Top SERPS Removed
So what a morning... Did a search for our website to see how we were doing with a couple of search terms, and it turns out our website is GONE. I mean not a slap down or fluctuations in positioning; we are literally gone. I see that the contact page pulls up on page 8, but effectively we don't exist. This is so odd and strange that I feel like this is a dream. The fact that we don't pull up on any search terms anymore makes me worry this is something more than a penalty. What on Earth could have caused this? Is this a fluke? Can I call Google? It seems the index page has been removed from the search results. Website: www.catdi.com (Catdi Printing). Search terms: Houston Printing, Houston Direct Mail, EDDM Printing, EDDM. Help! C
-
Tracking links and duplicate content
Hi all, I have a bit of a conundrum for you all pertaining to a tracking link issue I have run into on a client's site. They currently have a significant duplicate content problem: over 15,000 pages are being crawled (using Screaming Frog), but only 7,000+ are legitimate pages in the sense that they are not duplicates of themselves. The client is using Omniture instead of Google Analytics and an advanced tracking system for internal and external links (ictids and ectids) in the URL parameters. This is creating thousands of duplicated pages being crawled by Google (as seen in their Search Console and in Screaming Frog). They are also in the middle of moving from http to https and have thousands of pages currently set up for both, again creating a duplicate content issue. What I have suggested for the tracking links is setting up a URL parameter in Search Console. I've also suggested they canonical all tracking links to point to the clean page, so the pages that have already been indexed point to the correct clean URL. Does this seem like the appropriate strategy? Additionally, I've told them that before they submit a new sitemap to Google, they need to switch their website over to https to avoid worsening their duplicate content issue. They have not submitted a sitemap to Google Search Console since March 2015. Thank you for any help you can offer!
-
Best to Leave Toxic Links or Remove/Disavow on Site with Low Number of Linking Domains
Our site has only 87 referring domains (with at least 7,100 incoming links). LinkDetox has identified 29% of our backlinks as toxic and 14% as questionable. Virtually all of these links come from spammy sites. We never received a manual penalty, but ever since the first Penguin penalty in 2012 our search volume and rankings have dropped, with some uneven recovery in the last 3 years. By removing/disavowing toxic links, are we risking that over-optimized link text will be removed and that rankings will suffer as a result? Are we potentially shooting ourselves in the foot? Would we be better off spending a few months building quality links from reputable domains before removing/disavowing bad links? Or are the toxic links (as defined by LinkDetox) so bad that removing them should be a priority before taking any other step? Thanks, Alan
-
Shall I 301 a URL that has a discontinued product, or shall we remove it from Google's index?
My website sells shoes. These items go out of fashion and are replaced. Shall I 301 a URL that has a discontinued product, or shall we remove it from Google's index using Webmaster Tools? I seem to have a massive 301 list that keeps on growing, and I'm concerned that continuing to do 301s is not the right way.
-
Uptick in untracked conversions / anyone have a list of things that Google Analytics will not track?
There seems to have been an uptick in users on our site not being tracked in Google Analytics, because I see a lot more untracked revenue in the last 6 months than I used to. I know Analytics is still working, as it has been tracking a normal amount of visits, but I assumed there might be a reason less of it is actually showing up (maybe a change in what is being reported as organic). I know a lot of stuff goes into "not provided", such as logged-in search, but is there a list of everything that goes into "not provided" and everything that just does not get tracked (JavaScript not enabled, iOS)? If it could be something else as well, let me know. Thanks for the help!
-
Totally Remove "localhost" entries from Google Analytics
Hello All, In Google Analytics I see a bunch of traffic coming from "localhost:4444 / referral". I had tried once before to create a filter to exclude this traffic source, but obviously I did it wrong since it's still showing up. Here is the filter I have currently:
Filter Name: Exclude localhost
Filter Type: Custom filter > Exclude
Filter Field: Referral
Filter Pattern: .localhost:4444.
Case Sensitive: No
Can anyone see what I'm doing wrong and give me a push in the right direction? Thanks in advance!
-
What impact will Google's 10/18/2011 announcement of 'Making Search More Secure' have on the ability to track specific keyword queries via Analytics?
The full announcement is here: http://googleblog.blogspot.com/2011/10/making-search-more-secure.html My concern is that the ability for Google Analytics to parse information on specific keyword queries will be diminished. The article hints that Google Webmaster Tools will be exempt from the problem, and I've never relied on Webmaster tools as a go-to for tying specific keyword queries to Goal Tracking (form submissions and sales). The community's thoughts on this one are appreciated. 🙂