Removing blog posts with little/thin content
-
A large share (I'd say 75%) of our blog posts are what I'd consider low quality: short content (500 words or less) with few shares, no backlinks, and no comments; most of it gets 0-2 unique views a day (though combined this adds up).
Will removing these pages provide an SEO benefit greater than the reduction in traffic from the removal of these pages?
I've heard the likes of Neil Patel/Brian Dean suggest so, however I'm worried it will do the opposite: with less content indexed, I'll actually see traffic fall.
Sam
-
Sam,
If you can safely assume that the pages are not hurting you, let them stay. It's certainly not ideal to have a website loaded with thin content. But, as is the case with most small sites, the posts are likely to do you more good than harm, provided you're willing to show them some attention.
Here's a good strategy to deploy:
-
Find the top 10 posts, judged both by GA data and by the topics you hope to rank for, then beef them up with additional text and graphics.
-
Republish the posts, listing them as "updated."
-
Share the posts via social, using a meaningful quote from each piece to draw interest and invite re-shares.
-
Continue sharing the posts in the following weeks, each time with new text.
-
Gauge the performance of each social share, then use that data both to write headlines for new posts and to learn which content draws the most interest.
-
Repeat the process with the next 10 posts.
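Assuming you've exported the GA "All Pages" report to CSV, the first step above (finding the top 10 posts) can be sketched as follows. The column names here are made up for illustration; match them to whatever your actual export uses.

```python
import csv
from io import StringIO

# Stand-in for a GA "All Pages" export: page path plus unique pageviews.
# Column names are assumptions, not a real GA schema.
SAMPLE_EXPORT = """\
page,unique_pageviews
/blog/post-a,412
/blog/post-b,37
/blog/post-c,1203
/blog/post-d,8
"""

def top_posts(csv_text, n=10):
    """Return the n pages with the most unique pageviews, best first."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["unique_pageviews"]), reverse=True)
    return [(r["page"], int(r["unique_pageviews"])) for r in rows[:n]]

# The strongest candidates for beefing up with extra text and graphics:
print(top_posts(SAMPLE_EXPORT, n=2))
```

In practice you'd cross-reference this ranking against the keyword topics you want to own, not just raw pageviews.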
When you have thin, poorly performing content on your site, you aren't able to learn enough about what you're doing right to make a sound call. So to create more content, even "better" content, is likely a mistake. The wise approach is to use the content you have to investigate additional content ideas that would better serve your audience. Through social media and additional traffic to your site, you should be able to better discern what pieces of content will provide the greatest benefit in the future.
And the old content itself is likely to perform much better as well.
RS
-
-
It's difficult to talk in terms of true value. Some of them may provide some value, but they pale in comparison to the new blog posts we have lined up and, in my opinion, bring the blog down; personally I wouldn't be sad to see them go.
I think it's time to exterminate.
Sam
-
Do the contents of these blog posts provide any value at all to the reader? Are they written well, and would you actually be sad to see them go? If yes, then refer to my previous response on re-purposing them to create even better content with more SEO value.
If not, and you're just worried about SEO, then based on those stats I'd say be rid of them.
-
Thanks all, from my analysis:
In the last twelve months:
376 pages (although I'd estimate 70 of these aren't pages)
104 pages have a bounce rate of 100%
307 pages have fewer than 20 unique views (over the previous 12 months), but together those pages account for 1,374 unique views, which is a sizable sum.
So the question is: is it worth pulling all the pages below 20 unique views and all the 100% bounce rate pages from the site? Will it actually benefit our SEO, or am I just making work for myself?
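For what it's worth, the cull criteria I'm describing amount to a simple filter over the exported stats. Here's a rough sketch against made-up data (field names and thresholds are my own, not a real GA schema):

```python
# Hypothetical rows from a GA export; only these three fields matter here.
pages = [
    {"page": "/blog/a", "unique_views": 3,  "bounce_rate": 100.0},
    {"page": "/blog/b", "unique_views": 45, "bounce_rate": 62.0},
    {"page": "/blog/c", "unique_views": 19, "bounce_rate": 80.0},
]

def removal_candidates(rows, min_views=20, max_bounce=100.0):
    """Flag pages under the unique-view threshold or at 100% bounce."""
    return [r["page"] for r in rows
            if r["unique_views"] < min_views or r["bounce_rate"] >= max_bounce]

# Pages to review (not necessarily delete) before pulling anything:
print(removal_candidates(pages))
```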
I'd love to hear from people who've actually seen positive SEO movements after removing thin pages.
-
It's a waste of good content to remove it just because it's considered "thin". In your position, I would group these under-performing/thin blog posts into topical themes, then compile and update them to create "epic content" in the form of detailed guides or whatever format best suits the material. Add to the long post so that there's some logical structure to the combination (and so it doesn't read as if you stuck multiple posts together), then redirect the old post URLs to the newly created relevant posts. Not only will you have fresh posts that could each provide a ton of value to your readers, but the SEO value of these so-called "epic" posts should in theory be more impactful.
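The redirect step might look like this in an Apache .htaccess file (the paths here are placeholders, and if you're on nginx or another server you'd use its equivalent):

```apache
# Hypothetical example: two thin posts 301-redirect to the
# consolidated guide they were merged into.
Redirect 301 /blog/thin-post-one /guides/consolidated-guide
Redirect 301 /blog/thin-post-two /guides/consolidated-guide
```

Permanent (301) redirects are what pass the old URLs' link equity to the new guide.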
Good luck, whatever you decide to do!
-
My rule of thumb would be:
Take offline all pages that have under 30 organic sessions per month.
As Dmitrii already mentioned, check your past data for these posts and look at average session durations / bounce rates / pages per session, which let you validate the quality of the traffic. If there are posts with decent stats, don't take them offline; rather, update them or write a new blog post on the topic and set up a redirect. In that case, have a look in GWT at the actual search queries (you may find some useful new insights).
-
Hi there.
Are those blogs ranking at all for any related keyphrases? At the same time, what are the bounce rate and time on page for those 2 visits a day? Are you sure those visits aren't bots/crawlers?
We did a similar reduction about 6 months ago and we haven't seen any drop in rankings. The share of traffic to the thin pages was pretty small, bounce rate was high, and time on page was very short. So why keep anything that doesn't do any good?