Deleting low quality content
-
Hi there. I have a question about deleting low-quality content pages; hopefully someone can share feedback on it.
We run a B2C e-commerce store, and product pages are our target landing pages from search. Over the years we've built many informational pages related to different products, each linked to its related product pages.
The problem is that many of them lack quality content in terms of both volume and depth, and they aren't helping. Organic traffic peaked in February and has been declining since early this year.
So I'm considering deleting the pages that we (and Moz) consider low quality and that receive no search traffic.
Firstly, is that a good idea? Secondly, how should I go about it? Should I just delete them and add redirects so the deleted URLs point to related pages, or even the homepage?
Looking forward to any expert input.
-Yuji -
You do need to obtain proper SEO advice, but we often advise not deleting the page but improving it substantially.
If you have duplicated content, remove it and replace it with well-written, white-hat, high-quality content. This is how we've improved many businesses' local SEO: by improving on-page SEO rather than deleting pages outright.
-
It would be best to talk to an SEO agency for advice before you delete any blog posts or main pages.
-
Thanks for your advice. Yes, we will definitely be careful when deleting pages. Thanks a lot!
-
That's a really good idea! Cut what you have to manage down to the essentials and then spend more time on those pages. Make sure you run some kind of ranking or traffic audit against all the pages first, though. You don't want to delete the versions of each page that have some SEO power, even if it's small; you want to target the ones Google isn't using.
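To make that audit concrete, here is a minimal sketch of filtering a traffic export down to pages that aren't earning clicks. It assumes a CSV export along the lines of Search Console's pages report; the column names and URLs are hypothetical, so adapt them to whatever your tool actually exports:

```python
import csv
from io import StringIO

# Hypothetical traffic export (e.g. a Search Console pages report
# downloaded as CSV); column names here are assumptions.
SAMPLE = """page,clicks,impressions
/guide-widgets,120,4000
/old-info-page,0,35
/another-thin-page,0,2
"""

def pages_to_review(csv_text, min_clicks=1):
    """Return pages below the click threshold - candidates for
    improvement, consolidation, or (as a last resort) removal."""
    rows = csv.DictReader(StringIO(csv_text))
    return [row["page"] for row in rows if int(row["clicks"]) < min_clicks]

# Pages Google isn't sending any traffic to:
print(pages_to_review(SAMPLE))
```

Raising `min_clicks` widens the net; the point is to make the keep/improve/remove decision from data rather than gut feel.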
-
Thanks a lot for your feedback; it was helpful. I think we may need to remove duplicate pages, leaving only unique ones, and update their content to be more valuable. Thanks!
-
This is, generally speaking, **not the right mindset** to succeed.
When Google says (through decreasing ranking positions) that you haven't put in enough effort, usually deleting a poor attempt garners no favour in the ranking results. Think about it. Google are saying "you don't have enough quality content" and your answer is to delete content, thus having less than before. Does that seem like a genuine attempt to comply with the increasing stringency of Google's guidelines?
Deleting stuff is the easy way out. Think about it as if you wrote an essay in college and Google were the examiner. They give you a D- for your essay and mark certain areas of your work as needing improvement. If you deleted those paragraphs, did nothing else, and re-submitted the essay, would you honestly expect a better grade?
Google want to see effort, unique content, and value-add for end users. _Real_ hard graft.
If you have high volumes of pages which are identical other than one tiny tab of information or a variable price, then maybe streamlining your architecture by removing pages is the answer. If most of the pages are unique in function (e.g. factually different products, not just parameter-based URL variants), then it's more a comment on the lack of invested effort, and you must tackle your mindset if you want to rank.
N.B: By effort I don't mean your personal effort. I could also be alluding to the fact that budget was too low when producing content. I'm describing the site - not you personally!
Related Questions
-
Which URL should I choose when combining content?
I am combining content from two similar articles into one. URL 1 has a featured snippet and better URL structure, but only 5,000 page views in the last 6 months, and has 39 keywords ranking in the top 10. URL 2 has worse structure, but over 100k page views in the last 6 months, and 236 keywords in the top 10. Basically, I'm wondering if I keep the one with the better URL structure or the one with more traffic. The deleted URL will be redirected to whichever I keep.
Intermediate & Advanced SEO | curtis-yakketyyak
-
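As a sketch of the redirect step described in that question (the paths are hypothetical), a one-line permanent redirect in Apache's .htaccess passes the retired URL's equity to the article being kept:

```apache
# Permanently redirect the retired article to the one being kept
Redirect 301 /old-article /kept-article
```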
Weight of content further down a page
Hi, a client is trying to justify a design decision by saying he needs links to all his sub-pages on the top-level category page, as Google won't index them otherwise. However, the links are available on the sub-category pages, and the sub-categories are linked from the top-level page, so I have argued that as long as Google can crawl the links through the pages, they will be indexed and won't be penalised. Am I correct? Additionally, the client has said those links need to be towards the top of the page because content further down the page carries less weight; I don't believe this is the case, but can you confirm? Thanks again, Craig.
Intermediate & Advanced SEO | CSIMedia
-
2 URLS pointing to the same content
Hi, we currently have two URLs pointing to the same website (long story why): A and B. A is our main website, but we set up B as a rewrite URL to use for our pay-per-click campaign. Because it's the same site and B is just a URL rewrite, Google Webmaster Tools is seeing thousands of links coming in from site B to site A. I want to tell Google to ignore the site B URL but am worried it might affect site A. I can't add nofollow links on site B, as it's the same content, so they would also apply to site A. I'm also worried that using Google Disavow might impact site A! Can anyone make any suggestions on what to do? I would like to hear from anyone with experience of this, or who can recommend a safe option. Thanks for your time!
Intermediate & Advanced SEO | Party_Experts
-
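One standard remedy for this kind of two-hostname setup (a sketch, not advice from the thread; the hostnames are hypothetical) is a rel=canonical tag on every page served under the B hostname, pointing at the matching site A URL, so Google consolidates signals onto site A without needing nofollow or a disavow:

```html
<!-- In the <head> of each page when it is served from site B,
     point the canonical at the equivalent URL on site A -->
<link rel="canonical" href="https://www.site-a.example/current-page/" />
```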
Blocking poor quality content areas with robots.txt
I found an interesting discussion on seoroundtable where Barry Schwartz and others were discussing using robots.txt to block low quality content areas affected by Panda. http://www.seroundtable.com/google-farmer-advice-13090.html The article is a bit dated. I was wondering what current opinions are on this. We have some dynamically generated content pages which we tried to improve after panda. Resources have been limited and alas, they are still there. Until we can officially remove them I thought it may be a good idea to just block the entire directory. I would also remove them from my sitemaps and resubmit. There are links coming in but I could redirect the important ones (was going to do that anyway). Thoughts?
Intermediate & Advanced SEO | Eric_edvisors
-
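For reference, blocking a directory like the one described looks like this in robots.txt (the path is hypothetical). Note that Disallow only stops crawling; it does not remove URLs that are already indexed, which is why a noindex directive is often preferred for this kind of thin-content cleanup:

```
# robots.txt - stop crawling of a (hypothetical) thin-content directory
User-agent: *
Disallow: /dynamic-listings/
```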
Where would you advertise for a quality Guest Poster
I'm looking to find a high quality writer who understands SEO and the art of Guest Blogging/Posting on really good quality sites. Sites like Life Hacker etc... My Question is... Where would you advertise this job? I have tried People Per Hour in the past and never really found anyone and I find My Blog Guest (Guest Blog - Whatever it's called) to only really have low quality sites on there. What would you do? Cheers
Intermediate & Advanced SEO | JohnW-UK
-
Pages with Little Content
I have a website that lists events in Dublin, Ireland. I want to provide a comprehensive number of listings, but there are not enough hours in the day to provide a detailed (or even short) unique description for every event. At the moment I have some pages with little detail other than the event title and venue. Should I try to prevent Google from crawling/indexing these pages for fear of reducing the overall ranking of the site? At the moment I only link to these pages via the RSS feed. I could remove the pages entirely from my feed, but then that means I remove information that might be useful to people following the events feed. Here is an example page with very little content
Intermediate & Advanced SEO | andywozhere
-
Having a hard time with duplicate page content
I'm having a hard time redirecting website.com/ to website.com. The crawl report shows both versions as duplicate content. Here is my htaccess:

```apache
RewriteEngine On
RewriteBase /

# Rewrite bare domain to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]

# Serve extensionless URLs from their .php files
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]

# Strip trailing slashes (note: the substitution needs a "/" before $1,
# since the captured path has no leading slash in .htaccess context)
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]
```

I added the last two lines after seeing a Q&A here, but I don't think it has helped.
Intermediate & Advanced SEO | cgman
-
Google indexing flash content
Hi, would Google's indexing of Flash content count towards page content? For example, I have over 7,000 Flash files, with one unique Flash file per page followed by a short two-paragraph snippet; would Google count the Flash as content towards the overall page? At the moment I've tagged the robots with X-Robots-Tag noindex, nofollow and noarchive to prevent them from appearing in the search engines. I'm just wondering whether, if the Googlebot visits and accesses the Flash file, it'll get the noindex, nofollow tag and then stop processing. I think this may be why the Panda update also had an effect. Thanks
Intermediate & Advanced SEO | Flapjack