Noindex thousands of thin content pages?
-
Hello all!
I'm working on a site that features a service marketed to community leaders that allows the citizens of a community to log 311-type issues such as potholes, broken streetlights, etc. The "marketing" front of the site is 10-12 pages of content to be optimized for community-leader searches. However, as you can imagine, there are thousands and thousands of pages of one- or two-line complaints such as, "There is a pothole on Main St. and 3rd."
These complaint pages are not about the service, and I'm thinking they're not helpful to my end goal of gaining awareness of the service through search among community leaders. Community leaders are searching for "311 request service", not "potholes on Main Street".
Should all of these "complaint" pages be NOINDEX'd? What if there are a number of quality links pointing to the complaint pages? Do I have to worry about losing Domain Authority if I do NOINDEX them?
Thanks for any input.
Ken
-
Egol,
Thanks for this. I did consider the sub-domain option and I'm going to discuss it with my team.
Ken
-
Stephan,
There is little organic search traffic to these pages, but there are a number of links pointing to them. One of the benefits of this type of business is that you're associated with local governments, so you do get links from .gov sites. Most go to the service home pages, but some point to the individual issue pages.
The grouping by category is something to think about. I'll discuss with the team.
Thanks!
-
I really like Stephan's idea of "indexed collections of complaints".
-
Hi Ken,
It depends a little on how the complaints are organised within the site structure, what links they have, and what traffic these pages bring in. Unless you think domain authority is a particularly big factor in the competitive space the site operates in, I wouldn't fixate on DA. Questions you do want to answer:
- Crawl the whole site, preferably using the Google Search Console and/or Google Analytics API with Screaming Frog. Do these complaints bring in (useful) traffic? (See the sketch just after this list for one way to pull that from the Search Console API.) Surely part of what makes the 311 service useful for community managers is that people in their community can easily comment and see the comments of others? Thinking further down the line, if the site is difficult for people in the community to find, will they use it less, and thus will community managers see less value in the service over time? Indirectly, people leaving complaints is probably a good thing for the service; do they usually do this after searching for "potholes on main street"? This is all guesswork on my part, as I haven't seen the site.
- If you do have a lot of traffic to the complaint pages, is it useful traffic? Could you afford to lose it (because that may happen if you noindex)? Bear in mind the second-order effects: if nobody complains any more, the manager doesn't need a 311 service!
- Do you actually have valuable (external) links to the complaints? We can't guess at that—the only solution is to use Open Site Explorer, ahrefs, Majestic, etc...
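To make the first point concrete, here is one way to pull clicks and impressions for the complaint pages straight from the Search Console API. This is only a sketch: it assumes the complaint URLs share a /complaints/ path segment (a guess on my part), that the google-api-python-client library is installed, and that a service account already has access to the property.

    # Sketch: query the Search Console (webmasters) API for search traffic
    # to complaint URLs. Assumes google-api-python-client plus a service
    # account that has been granted access to the property.
    from googleapiclient.discovery import build
    from google.oauth2 import service_account

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    gsc = build("webmasters", "v3", credentials=creds)

    report = gsc.searchanalytics().query(
        siteUrl="https://www.example-311-service.com/",  # hypothetical property URL
        body={
            "startDate": "2017-01-01",
            "endDate": "2017-03-31",
            "dimensions": ["page"],
            "dimensionFilterGroups": [{
                "filters": [{
                    "dimension": "page",
                    "operator": "contains",
                    "expression": "/complaints/",  # assumed URL pattern
                }],
            }],
            "rowLimit": 5000,
        },
    ).execute()

    for row in report.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])

Joining that output against a Screaming Frog crawl export will show which complaint pages actually earn search traffic and which are just sitting in the index.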
Without knowing more, I'll just say: there probably isn't value in having an indexed page for each complaint, but there might be value in having indexed collections of complaints, optimised for neighbourhood or street. So if there are 6 complaints about potholes on Main Street, a first step might be for each individual complaint page to canonical back to the page detailing all complaints about Main Street. And if complaints are really that brief (1 or 2 sentences), eventually I'd prefer to change the site structure altogether, so that each complaint didn't get its own page at all, and instead there was one page for each neighbourhood/street/etc., with the complaints listed there and preferably summarised in some way (e.g. "8 pothole complaints", "9 traffic light complaints", etc.). That kind of view might be useful if I were a resident of the place. You would still have to deal with pagination, especially if the number of complaints is large, but that's still far fewer pages than one for every complaint individually.
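To sketch what that could look like in practice (purely illustrative: it assumes a Python/Flask-style app and a per-street URL scheme, both of which are guesses about your setup):

    # Sketch: one indexable collection page per street; each individual
    # complaint page canonicals back to its street's collection page.
    from flask import Flask, render_template_string

    app = Flask(__name__)

    # Hypothetical data source -- replace with the real complaint store.
    COMPLAINTS = {
        1: {"street": "main-st", "text": "There is a pothole on Main St. and 3rd."},
        2: {"street": "main-st", "text": "Streetlight out near Main St. and 5th."},
    }

    @app.route("/complaints/<street>/")
    def street_complaints(street):
        items = [c for c in COMPLAINTS.values() if c["street"] == street]
        # This is the page you would want indexed (paginated and summarised
        # once the complaint counts get large).
        return render_template_string(
            "<h1>Complaints for {{ street }}</h1>"
            "<ul>{% for c in items %}<li>{{ c.text }}</li>{% endfor %}</ul>",
            street=street, items=items)

    @app.route("/complaint/<int:complaint_id>/")
    def single_complaint(complaint_id):
        c = COMPLAINTS[complaint_id]
        canonical = "https://www.example-311-service.com/complaints/%s/" % c["street"]
        # The canonical asks search engines to consolidate this thin page
        # into the street-level collection page.
        return render_template_string(
            '<link rel="canonical" href="{{ canonical }}"><p>{{ c.text }}</p>',
            canonical=canonical, c=c)

Once the street-level pages exist, what you do with the individual complaint URLs (canonical, noindex, or remove them entirely) becomes a much smaller decision, because the indexable content lives at the collection level.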
-
Just stating a couple of facts and a couple of things that I believe about those facts. I'll be clear below about which parts are beliefs.
-
If you have a lot of thin content pages on a website, you run the risk of Google seeing those thin content pages and slapping the domain with a Panda problem. I believe that can cause reduced rankings across the entire domain.
-
Google recently said that they are going to stop following the links on noindexed pages. From that, I believe that some PageRank will be lost from every link that enters them. I believe that can result in lower rankings for the entire domain.
If I owned the site above, I would place all of these pages where they can be safely noindexed without causing a loss of PageRank or producing a Panda problem. That would require them to be on a subdomain that is noindexed, or on another domain that is noindexed.
That's what I would do with these pages.
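For illustration only, here is roughly what keeping an entire complaints subdomain out of the index could look like, assuming that subdomain were served by a small Python/Flask app (an assumption on my part; the same X-Robots-Tag header can be set at the web server level instead):

    # Sketch: send a site-wide noindex header for everything served on a
    # dedicated complaints subdomain, e.g. complaints.example-311-service.com
    # (hypothetical hostname).
    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def noindex_entire_host(response):
        # Applied to every response on this host, so each complaint page
        # carries "X-Robots-Tag: noindex" without per-page template changes.
        response.headers["X-Robots-Tag"] = "noindex"
        return response

    @app.route("/complaint/<int:complaint_id>/")
    def complaint(complaint_id):
        # Placeholder view -- the real app would render the logged issue.
        return "Complaint %d" % complaint_id

The same effect could be achieved with a robots meta tag in the page templates; the header just avoids touching every template.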