Deleting low-quality content
-
Hi there. I have a question about deleting low-quality content pages, and I'm hoping someone can share feedback.
We run a B2C e-commerce store, and product pages are our target landing pages from search. Over the years we've built many informational pages related to different products, each linked to the relevant product pages.
The problem is that many of them are thin in both volume and quality, and they aren't helping. Organic traffic peaked in February and has been declining since early this year.
So I'm considering deleting the pages that both we and Moz consider low quality and that aren't receiving any search traffic.
First, is that a good idea? Second, how should I go about it? Should I just delete them and add 301 redirects so the deleted pages point to related pages, or even the homepage?
Looking forward to any expert input.
-Yuji
-
You do need to get proper SEO advice, but often we advise not deleting a page but improving it substantially.
If you have duplicate content, remove it and replace it with well-written, white-hat, high-quality content. This is how we've improved many businesses' local SEO: by improving on-page SEO rather than deleting pages outright.
-
It would be best to talk to an SEO agency to get advice before you delete any blog posts or main pages.
-
Thanks for your advice. Yes, we will definitely be careful when deleting pages. Thanks a lot!
-
That's a really good idea! Cut what you have to manage down to the essentials, then spend more time on those pages. Make sure you run some kind of ranking or traffic audit across all the pages first, though. You don't want to delete pages that still have some SEO power, even if it's small; you want to target the ones Google isn't using.
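For example, a first-pass audit can be as simple as flagging pages that get no clicks and barely any impressions in a Search Console page-level export. Here's a minimal Python sketch of that idea; the file name, column names, and thresholds are hypothetical placeholders, so adjust them to match your own export:

```python
# Minimal sketch: flag prune-or-improve candidates from a Search Console
# page-level export. Assumes a CSV named "gsc_pages.csv" with columns
# "page", "clicks", "impressions" (file name, column names, and
# thresholds are all placeholders -- adjust to your own export).
import csv

CLICK_THRESHOLD = 0        # pages with no clicks at all...
IMPRESSION_THRESHOLD = 10  # ...and almost no impressions

candidates = []
with open("gsc_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if (int(row["clicks"]) <= CLICK_THRESHOLD
                and int(row["impressions"]) <= IMPRESSION_THRESHOLD):
            candidates.append(row["page"])

# Review this list by hand before deleting anything; a page with few
# impressions may still be earning links or assisting conversions.
for url in sorted(candidates):
    print(url)
```

Cross-check the candidates against backlink data too, so you don't cut a page that ranks for nothing but still carries links.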
-
Thanks a lot for your feedback; it was helpful. I think we may need to remove duplicate pages, leaving only the unique ones, and update their content to be more valuable. Thanks!
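If you do remove duplicates, it's worth mapping each removed URL to its closest surviving page by hand rather than redirecting everything to the homepage, since Google tends to treat blanket homepage redirects as soft 404s. Here's a minimal sketch of turning such a hand-reviewed map into Apache `Redirect` rules; the URLs and output file name are hypothetical:

```python
# Minimal sketch: turn a hand-reviewed map of removed pages onto their
# closest related surviving pages into Apache mod_alias rules.
# The URLs and the output file name are hypothetical placeholders.
redirect_map = {
    "/guides/old-thin-page": "/products/related-product",
    "/guides/another-thin-page": "/category/related-category",
}

with open("redirects.conf", "w", encoding="utf-8") as f:
    for old_path, new_path in redirect_map.items():
        # Point each removed URL at its most relevant page; redirecting
        # everything to the homepage risks being treated as a soft 404.
        f.write(f"Redirect 301 {old_path} {new_path}\n")
```

On Nginx the equivalent would be `return 301` rules inside `location` blocks; the mapping step is the same either way.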
-
Generally speaking, this is **not the right mindset** to succeed.
When Google says (through decreasing ranking positions) that you haven't put in enough effort, usually deleting a poor attempt garners no favour in the ranking results. Think about it. Google are saying "you don't have enough quality content" and your answer is to delete content, thus having less than before. Does that seem like a genuine attempt to comply with the increasing stringency of Google's guidelines?
Deleting stuff is the easy way out. Think about it as if you wrote an essay in college and Google were the examiner. They give you a D- for your essay and mark certain areas of your work as needing improvement. If you deleted those paragraphs, did nothing else, and re-submitted the essay, would you honestly expect a better grade?
Google want to see effort, unique content, value-add for end users. _Real_ hard graft.
If you have high volumes of pages which are identical other than one tiny tab of information or a variable price, then maybe streamlining your architecture by removing pages is the answer. If most of the pages are unique in function (e.g. factually different products, not just parameter-based URL variants; a quick way to spot those is sketched below), then it's more a comment on the lack of invested effort, and you must tackle your mindset if you want to rank.
N.B.: By effort I don't mean your personal effort. I could also be alluding to the budget being too low when the content was produced. I'm describing the site, not you personally!
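To make the "parameter-based URL variants" point concrete, here's a minimal sketch that groups crawled URLs by path while ignoring query strings, so the variants stand out; the URL list is a stand-in for a real crawl export:

```python
# Minimal sketch: group crawled URLs by path (ignoring query strings) to
# surface parameter-based duplicates that are candidates for canonical
# tags or consolidation. The URL list stands in for a real crawl export.
from collections import defaultdict
from urllib.parse import urlsplit

crawled_urls = [
    "https://example.com/widgets?color=red",
    "https://example.com/widgets?color=blue",
    "https://example.com/gadgets",
]

groups = defaultdict(list)
for url in crawled_urls:
    parts = urlsplit(url)
    groups[f"{parts.scheme}://{parts.netloc}{parts.path}"].append(url)

for base, variants in groups.items():
    if len(variants) > 1:
        # Multiple variants of one path: consolidate or canonicalise
        # rather than treating each as a unique page to improve.
        print(base, "->", variants)
```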