Removing blog posts with little/thin content
-
We've got quite a lot of blog posts (I'd say 75%) that I'd consider low quality: short content (500 words or less) with few shares, no backlinks, and no comments. Most get 0-2 unique views a day (though combined, this adds up).
Will removing these pages provide an SEO benefit greater than the traffic lost by removing them?
I've heard the likes of Neil Patel and Brian Dean suggest so; however, I'm worried it will do the opposite: with less content indexed, I'll actually see traffic fall.
Sam
-
Sam,
If you can safely assume that the pages are not hurting you, let them stay. It's certainly not ideal to have a website loaded with thin content. But, as is the case with most small sites, the posts are likely to do you more good than harm, provided you're willing to show them some attention.
Here's a good strategy to deploy:
-
Find the top 10 posts, judged by analyzing GA data against the topics you hope to rank for, then beef them up with additional text and graphics (see the sketch after this list).
-
Republish the posts, listing them as "updated."
-
Share the posts via social, using a meaningful quote from each piece to draw interest and invite re-shares.
-
Continue sharing the posts in the following weeks, each time with new text.
-
Gauge the performance of each social share, then use this information both to create headlines for new posts and to learn which content draws the most interest.
-
Repeat the process with the next 10 posts.
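If it helps, here's a minimal sketch of that first step in Python, assuming you've exported your blog pages from GA as a CSV; the file name and column headers below are placeholders, so match them to whatever your actual export uses:

```python
# Rough sketch: pull the top 10 posts from a GA export by unique
# pageviews. File and column names are placeholder assumptions.
import pandas as pd

df = pd.read_csv("blog_pages_export.csv")
top10 = df.nlargest(10, "unique_pageviews")[["page", "unique_pageviews"]]
print(top10.to_string(index=False))
```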
When you have thin, poorly performing content on your site, you can't learn enough about what you're doing right to make a sound call. So creating more content, even "better" content, is likely a mistake. The wiser approach is to use the content you already have to investigate additional content ideas that would better serve your audience. Through social media and the additional traffic to your site, you should be able to better discern which pieces of content will provide the greatest benefit in the future.
Additionally, the old content is likely to perform much better.
RS
-
-
It's difficult to talk in terms of true value. Some of them may provide some value, but they pale in comparison to the new blog posts we have lined up and, in my opinion, bring the blog down; personally, I wouldn't be sad to see them go.
I think it's time to exterminate.
Sam
-
Do the contents of these blog posts provide any value at all to the reader? Are they written well, and would you actually be sad to see them go? If yes, then refer to my previous response on re-purposing them to create even better content with more SEO value.
If not, and you're just worried about SEO, I'd say be rid of them, based on those stats.
-
Thanks, all. From my analysis:
In the last twelve months:
376 pages (although I'd estimate 70 of these aren't actual content pages)
104 pages have a bounce rate of 100%
307 pages have fewer than 20 unique views (over the previous 12 months), but their combined total is 1,374 views, which is a sizable sum.
So the question is: is it worth pulling all the pages below 20 unique views, plus all the 100% bounce rate pages, from the site? Will it actually benefit our SEO, or am I just making work for myself?
I'd love to hear from people who've actually seen positive SEO movements after removing thin pages.
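For anyone who wants to run the same kind of audit, here's a rough sketch of how the filtering might look in Python, assuming a GA "All Pages" CSV export; the file name and column headers are placeholders for whatever your export actually uses:

```python
# Thin-content audit sketch: flag pages under 20 unique views and
# pages with a 100% bounce rate. Column names are placeholders.
import pandas as pd

df = pd.read_csv("all_pages_last_12_months.csv")

thin = df[df["unique_pageviews"] < 20]
full_bounce = df[df["bounce_rate"] >= 100.0]

print(f"{len(thin)} pages under 20 unique views, "
      f"{int(thin['unique_pageviews'].sum())} combined views")
print(f"{len(full_bounce)} pages with a 100% bounce rate")
```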
-
It's a waste of good content to remove it just because it's considered "thin". In your position, I would group these under-performing thin blog posts into topical themes, then compile and update them to create "epic content" in the form of detailed guides or whatever format best suits the material. Expand the long post so there's a logical structure to the combined pieces (so it doesn't read as if you simply stuck multiple posts together), then redirect the old post URLs to the newly created relevant posts. Not only will you have fresh posts that each provide a ton of value to your readers, but the SEO value of these so-called "epic" posts should, in theory, be more impactful.
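To make the redirect step concrete, here's a minimal sketch in Python that generates the rules, assuming an Apache server (Redirect 301 is standard mod_alias syntax; nginx users would write equivalent return 301 rules). All URLs here are hypothetical examples:

```python
# Generate 301 redirect rules pointing old thin posts at the new
# consolidated guide. URLs below are hypothetical placeholders.
redirect_map = {
    "/blog/quick-tip-1": "/guides/complete-topic-guide",
    "/blog/quick-tip-2": "/guides/complete-topic-guide",
    "/blog/quick-tip-3": "/guides/complete-topic-guide",
}

with open("redirects.htaccess", "w") as f:
    for old_url, new_url in redirect_map.items():
        f.write(f"Redirect 301 {old_url} {new_url}\n")
```

Drop the generated lines into your .htaccess (or translate them for your server) so the old URLs pass their equity to the consolidated posts.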
Good luck, whatever you decide to do!
-
My rule of thumb would be:
Take offline all pages that have fewer than 30 organic sessions per month.
As Dmitrii already mentioned, check your past data for these posts and look at average session duration, bounce rate, and pages per session, which you can use to validate the quality of the traffic. If there are posts with decent stats, don't take them offline; rather, update them or write a new blog post on the topic and set up a redirect. In that case, have a look in GWT at the actual search queries (you may find some useful new insights).
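To make that rule of thumb concrete, here's a small sketch; the engagement thresholds (60 seconds average duration, 85% bounce rate) are illustrative assumptions, not fixed numbers, so tune them against your own data:

```python
# Triage sketch for the rule of thumb above. The 30-sessions cutoff
# comes from the advice here; the engagement thresholds are
# illustrative assumptions to tune against your own analytics.
def triage(organic_sessions_per_month, avg_duration_s, bounce_rate_pct):
    """Return 'keep', 'update-and-redirect', or 'remove' for a post."""
    if organic_sessions_per_month >= 30:
        return "keep"
    # Low traffic but decent engagement: refresh or merge instead.
    if avg_duration_s >= 60 and bounce_rate_pct <= 85:
        return "update-and-redirect"
    return "remove"

print(triage(5, 120, 40))   # low traffic, good engagement -> update
print(triage(5, 10, 100))   # low traffic, poor engagement -> remove
```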
-
Hi there.
Are those blogs ranking at all for any related keyphrases? Also, what are the bounce rate and time on page for those 2 visits a day? Are you sure those visits aren't bots/crawlers?
We did a similar reduction about 6 months ago and haven't seen any drop in rankings. The share of traffic to the thin pages was pretty small, the bounce rate was high, and time on page was very short. So why keep anything that isn't doing any good?