Noindex thousands of thin-content pages?
-
Hello all!
I'm working on a site that features a service, marketed to community leaders, that lets the citizens of a community log 311-type issues such as potholes, broken streetlights, etc. The "marketing" front of the site is 10-12 pages of content to be optimized for the community-leader searchers. However, as you can imagine, there are thousands and thousands of pages of one- or two-line complaints such as, "There is a pothole on Main St. and 3rd."
These complaint pages are not about the service, and I'm thinking not helpful to my end goal of gaining awareness of the service through search for the community leaders. Community leaders are searching for "311 request service", not "potholes on main street".
Should all of these "complaint" pages be NOINDEX'd? What if there are a number of quality links pointing to the complaint pages? Do I have to worry about losing Domain Authority if I do NOINDEX them?
Thanks for any input.
Ken
-
Egol,
Thanks for this. I did consider the sub-domain option and I'm going to discuss this as an option with my team.
Ken
-
Stephan,
There is little Organic Search traffic to these pages but there are a number of links pointing to them. One of the benefits of this type of business is that you're associated with local governments so you do get links from .gov sites. Most go to the service home pages but there are some that drive to the individual issue pages.
The grouping by category is something to think about. I'll discuss with the team.
Thanks!
-
I really like Stephan's idea of "indexed collections of complaints".
-
Hi Ken,
It depends a little on how the complaints are organised within the site structure, what links they have, and what traffic these pages bring in. Unless you think domain authority is a particularly big factor in the competitive space the site operates in, I wouldn't fixate on DA. Questions you do want to answer:
- Crawl the whole site, preferably using the Google Search Console and/or Google Analytics API with Screaming Frog. Do these complaints bring in (useful) traffic? Surely part of what makes the 311 service useful for community managers is that people in their community can easily comment and see the comments of others? Thinking further down the line, if the site is difficult for people in the community to find, will they use it less, and thus will community managers see less value in the service over time? Indirectly, people leaving complaints is probably a good thing for the service; do they usually do this after searching for "potholes on main street"? This is all guesswork on my part, as I haven't seen the site.
- If you do have a lot of traffic to the complaint pages, is it useful traffic? Could you afford to lose it (because that may happen if you noindex)? Remember to bear in mind the second-order effects: if nobody complains any more, the manager doesn't need a 311 service!
- Do you actually have valuable (external) links to the complaints? We can't guess at that—the only solution is to use Open Site Explorer, ahrefs, Majestic, etc...
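To make that last check concrete: here's a minimal sketch of auditing a crawl/link export for external links to individual complaint pages. The CSV columns, the `/complaints/` URL pattern, and the domain names are all assumptions for illustration — substitute whatever your Screaming Frog or link-tool export actually uses.

```python
import csv
import io

# Hypothetical "All Inlinks"-style export; the column names and URLs
# below are made up for the example, not a real export format.
sample = """Source,Destination
https://gov.example/streets,https://service.example/complaints/123
https://service.example/about,https://service.example/complaints/456
https://news.example/story,https://service.example/complaints/123
"""

def external_links_to_complaints(csv_text, site_host="service.example"):
    """Count external inlinks pointing at individual complaint pages.

    Skips internal links (source on our own host) so we only see the
    links we would risk losing value from by noindexing.
    """
    counts = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        src, dest = row["Source"], row["Destination"]
        if "/complaints/" in dest and site_host not in src:
            counts[dest] = counts.get(dest, 0) + 1
    return counts

print(external_links_to_complaints(sample))
# {'https://service.example/complaints/123': 2}
```

If the result is a handful of pages with real external links, those are the ones worth redirecting or canonicalising to a collection page rather than simply noindexing.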
Without knowing more, I'll just say: there probably isn't value in having an indexed page for each complaint, but there might be value in having indexed collections of complaints, optimised for neighbourhood or street. So if there are 6 complaints about potholes on main street, a first step might be for each individual complaint page to canonical back to the page detailing all complaints about main street. And if complaints are really that brief (1 or 2 sentences), eventually I'd prefer to change the site structure altogether, so that each complaint didn't get its own page at all, but instead there was one page for each neighbourhood/street/etc, with the complaints listed there and preferably summarised in some way (e.g. "8 pothole complaints", "9 traffic light complaints", etc.) That kind of view might be useful if I was a resident of the place. You would still have to deal with pagination, especially if the number of complaints is large, but that's still going to be far fewer pages than if you have one for every complaint individually.
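As a sketch of that first step (the URL structure here is hypothetical), the individual complaint page would carry a canonical pointing at the street-level collection page:

```html
<!-- On the individual complaint page, e.g. /complaints/123 -->
<link rel="canonical" href="https://service.example/streets/main-st" />
```

One caveat: canonical tags across pages that aren't near-duplicates are only a hint, and Google may ignore them. If it does, consolidating the complaints into the collection pages outright (the structural change described above) is the more reliable fix.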
-
Just stating a couple of facts and a couple of things that I believe about those facts. I'll be clear about which parts are beliefs below.
-
If you have a lot of thin content pages on a website then you run the risk of Google seeing those thin content pages and slapping the domain with a Panda problem. I believe that can cause reduced rankings across the entire domain.
-
Google recently said that they are going to stop following the links on noindex pages. From that, I believe that some pagerank will be lost from every link that enters them. I believe that can result in lower rankings for the entire domain.
If I owned the site above, I would place all of these pages where they can be safely noindexed without causing a loss of pagerank and without producing a Panda problem. That would require them to be in a subdomain that is noindexed, or on another domain that is noindexed.
That's what I would do with these pages.
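For the subdomain route, one way to noindex everything served from it in a single place (assuming an Apache server with mod_headers; adjust for your stack) is an `X-Robots-Tag` response header in that subdomain's vhost, rather than editing thousands of page templates:

```apache
# In the vhost for the hypothetical complaints.example.com subdomain:
# send a noindex directive on every response it serves.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```

Note that for the directive to be seen, crawlers must still be allowed to fetch the pages — don't also block the subdomain in robots.txt, or the header will never be read.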
-