Does a large number of thin-content pages in the index affect overall site performance?
-
Hello Community,
My question is about the negative impact of having many virtually identical calendar pages indexed.
Our site promotes a B2B software product. There are about 150 product-related pages and another 1,200 or so short articles on industry-related topics. In addition, Google recently (~4 months ago) indexed a large number of calendar pages used for webinar schedules, which boosted the indexed-pages count shown in Webmaster Tools to about 54,000.
Since then, we have nofollowed the links on the calendar pages that let you view future months, and added noindex meta tags to all future-month pages (beyond six months out). Our indexed-pages count has been dropping and is now down to 26,000.
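For reference, the two changes described above look roughly like this in the page markup (a sketch only; the URL paths and link text are hypothetical, not taken from the actual site):

```html
<!-- On a calendar page: nofollow the navigation link to a future month -->
<a href="/webinars/calendar/2015-12" rel="nofollow">Next month</a>

<!-- In the <head> of each future-month page beyond six months out:
     keep the page out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">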
When you look at Google's report showing pages that appear in response to search queries, a more typical 890 pages show up, and very few calendar pages are among them.
So, the question that has been raised is: does a large number of indexed pages with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites, these pages would cause this site's performance to drop as well.
Thanks for your feedback.
Chris
-
Unless a page can give value to a searcher (not just an existing customer), it shouldn't be in Google's search index.
Sometimes I like to go back to basics. Remember that search engines exist to help people find information that they WANT to find. Realistically, people are not going to want to find every page on your website in the SERPs. I suggest you ask yourself this question: does this page offer information that someone would actually want to search for? Then make your decision accordingly.
p.s. Having said all of that, I'll answer your question. The answer is yes: having thin pages on your site can hurt your domain. If your pages offer value to searchers, I suggest you improve them instead of removing them; if they don't offer value to searchers, don't waste your time, and just noindex them.
-
So, the question that has been raised is: Does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site?
Yes.
We had a site with some image-content pages that had very little text. They ranked great for years. Then, BAM, rankings across the site dropped on a Panda update.
We added noindex, follow to these pages, redirected the ones that were obsolete, and our rankings came back with the next update.
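The redirects mentioned above can be plain 301s at the server level. A minimal sketch, assuming an Apache server with mod_alias enabled (the file paths here are made up for illustration):

```apache
# .htaccess sketch: permanently redirect an obsolete image page
# to its closest current equivalent, preserving most link equity
Redirect 301 /gallery/old-series.html /gallery/current-series.html
```

A 301 is the right choice over a 302 here because the old pages are gone for good, and a permanent redirect signals that their ranking signals should transfer to the destination.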