Smaller Index
-
Hi guys,
We are a price comparison website with thousands of webpages. Most of them are product pages with fairly thin content: only price information and a product image, with no product details or customer reviews.
We are planning to focus on fewer product categories by adding reviews, details, better images, etc., and I would like to know whether I should keep the other "not-so-good" products in the remaining categories or remove them from the index to raise the domain's average content quality.
Our index size is 200k pages and we are planning to focus on 10k pages max.
Thanks for your help.
-
Hello Pedro,
I think you are making a very wise decision. If you have already been throttled by Panda, this could be what you need to bring the site out of it. If not, this could be what you need to save you from a future update. In fact, Matt Cutts recently answered a question about this sort of thing:
http://www.youtube.com/watch?v=adocBLGQoYE
Note: The question is about "no results" pages, but he discusses similar scenarios as well.
These sorts of "stub pages" have been a thorn in Google's side for many years, and rest assured they will continue to find ways of keeping them out of the index - including punishing the good content on sites that use them.
As Infant Raj mentioned below, be sure the URLs return a 404 status code in the HTTP header, which will ensure prompter removal from the index than if they were to redirect or return a 200 status code. I'd ignore the first paragraph in his answer, though.
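If the retired product URLs follow a predictable pattern, the 404 (or the even stronger 410 "Gone") can be sent straight from the web server. A minimal Apache .htaccess sketch - the /products/old/ path here is purely hypothetical, substitute your own URL structure:

```apache
# Hypothetical example: retired product pages all live under /products/old/.
# 410 "Gone" tells crawlers the removal is permanent, not accidental.
RedirectMatch gone ^/products/old/.*$

# Alternatively, return a plain 404 for a single retired URL:
# Redirect 404 /products/discontinued-widget.html
```

Either status gets the pages dropped from the index; just avoid 301-redirecting them all to the homepage, which Google treats as soft 404s.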
-
If those less significant pages aren't entry pages for organic or referral traffic, you can remove them. Otherwise, it's not a good idea to remove those pages just to reduce the number of indexed pages.
If those pages are removed, make sure you add a custom 404 page to handle the 404 errors.
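In Apache, wiring up a custom 404 page is a one-liner in .htaccess; the error-page path below is just an example:

```apache
# Serve a branded error page (hypothetical path) for missing URLs.
# Use a local path, not a full http:// URL - a full URL would make
# Apache issue a 302 redirect instead of returning the 404 status.
ErrorDocument 404 /errors/not-found.html
```

The custom page improves the visitor experience while crawlers still see the 404 status code they need in order to drop the URL.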
Related Questions
-
If my blog is on WordPress, and I've installed the AMP plug-in, what do I need to do to get Google to start indexing all my posts as AMP pages?
If I add /amp to the end of any of my posts, I can see that the plug-in is working. It's been months since I installed it, though, and Google hasn't indexed any of the AMP pages. Am I missing a step?
-
Sitemap - 200 out of 2100 pages indexed
I submitted the .xml sitemap in Google Webmaster Tools and only 200 out of 2100 pages were indexed. Why is that, and what can I do?
-
One story stands out for not getting indexed?
All the stories we published today (20-Jun-2013) got indexed by Google except this one: http://coed.com/2013/06/20/heres-a-video-of-kate-upton-topless-on-a-horse/. Does anyone out there have any clue about that? Thanks in advance.
-
Not all the pages on my website are indexed by Google - what should I do?
On my website, http://www.dubins.ae, not all of the pages are indexed by Google. How can I make sure that all the pages get indexed?
-
In Index but Not in SERPs
Hi, I have a situation with a client site which is quite frustrating. Basically, most recent (by that I mean from the last couple of months) blog posts are failing to reach the SERPs (actually, one has, and a couple did in the early days, but it took months for them to arrive). Previously the blog posts were indexed very quickly - often instantly. Now, I've checked WMT etc. and I've submitted each post manually, but still nothing. The sitemap is valid etc. However, pages (not blog posts) seem to be getting into the SERPs very quickly. Another complication is that if I search site:www.domainname.com and set the date filter to a month, I can see some of the earlier blog posts in that result set. However, if I scrape a bit of unique content from one of those posts and search - nothing in the SERPs. And my Moz report tells me that the page is not to be found in the top 50 either (so I'm confident these pages are not in the SERPs). Any ideas why this would happen to just blog posts? Is it something to do with the parent blog landing page perhaps being too strong in the rankings? Any ideas appreciated. Thanks.
-
Indexing of PDF files
Hey all, I understand the mechanics of PDF files being indexed and how to remove them if required, so in this post I'm not asking for "how to" advice as such; I just wanted to get a general opinion/consensus on a few points:
Do you deliberately allow PDF files to be crawled/indexed?
Do you optimise the files for search?
If you disallow them from being crawled and indexed, why?
Generally, what pros and cons have you found to having searchable PDF files as part of your indexed content?
-
Should I Have No Index, No Follow On Blog Category & Tag Pages?
At some point in the past I read or was told that No Index, No Follow tags on category and tag pages were a good thing on a standard WordPress blog in order to prevent duplicate content issues. Is this still true or was it ever true?
-
Please help me stop Google indexing https pages on my WordPress site
I added SSL to my WordPress blog because that was the only way to get a dedicated IP address for my site at my host. Now I am noticing Google has started indexing posts as both http and https. Can someone please help me force Google not to index https, as I am sure it's like having duplicate content? All help is appreciated. So far I have added this to the top of my .htaccess file:
RewriteEngine on
Options +FollowSymlinks
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots.txt$ robots_ssl.txt
And added robots_ssl.txt with the following:
User-agent: Googlebot
Disallow: /
User-agent: *
Disallow: /
But https pages are still being indexed. Please help.
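For what it's worth, rather than hiding the https URLs behind a separate robots file, a commonly recommended alternative is to 301-redirect all https requests back to their http equivalents (assuming no page on the site actually needs to be served over SSL), so the duplicates consolidate instead of merely being blocked. A hedged .htaccess sketch:

```apache
RewriteEngine on
# If the request arrived over SSL (port 443), permanently
# redirect it to the equivalent http:// URL so Google
# consolidates signals on the http version.
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Blocking via robots.txt only stops crawling; URLs Google already knows can stay indexed, which is likely why the https pages are still showing up.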