Noindexing Thin Content Pages: Good or Bad?
-
If you have a large number of pages with very thin content (such as pagination pages) and you noindex them, then once they are removed from Google's index (and if these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404?), or is there a valid reason to keep them?
If you noindex them, should you keep all of the URLs in the sitemap so that Google will recrawl them and notice the noindex tag?
If you noindex them and then remove them from the sitemap, can Google still recrawl them and recognize the noindex tag on its own?
-
Sometimes you need to leave the crawl path open to Googlebot so it can get around the site. A specific example that may be relevant to you is pagination. If you have 100 products and are only showing 10 on the first page, Google will not be able to reach the other 90 product pages as easily if you block the paginated pages in robots.txt. Better options in such a case might be a robots noindex,follow meta tag, rel next/prev tags, or a "view all" page set as the canonical.
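For a paginated series, those options might look something like this (a sketch; the URLs are hypothetical):

```html
<!-- On page 2 of a paginated series (hypothetical URLs) -->

<!-- Option 1: keep the page out of the index, but let Googlebot
     follow its links to the deeper product pages -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: tell Google where this page sits in the series -->
<link rel="prev" href="https://example.com/products?page=1">
<link rel="next" href="https://example.com/products?page=3">

<!-- Option 3: point every paginated page at a "view all" version -->
<link rel="canonical" href="https://example.com/products/view-all">
```

These would normally be used individually rather than stacked together; for example, a "view all" canonical generally replaces rel next/prev rather than accompanying it.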
If these pages aren't important to the crawlability of the site, such as internal search results, you could block them in the robots.txt file with few or no issues, and it would help to get them out of the index. If they aren't useful to spiders, users, or anything else, then yes, you can and probably should let them 404 rather than blocking them.
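A robots.txt rule for internal search results might look like this (a sketch; the /search path and q parameter are assumptions about your URL structure):

```text
User-agent: *
Disallow: /search
Disallow: /*?q=
```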
Yes, I do like to leave the blocked or removed URLs in the sitemap for just a little while to ensure Googlebot revisits them and sees the noindex tag, 404 error code, 301 redirect, or whatever it is they need to see in order to update their index. They'll get there on their own eventually, but I find it faster to send them to the pages myself. Once Googlebot visits these URLs and updates their index, you should remove them from your sitemaps.
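The temporary-sitemap approach described above could be sketched as a small XML sitemap containing only the URLs you want Googlebot to revisit (the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Recently noindexed, redirected, or removed URLs, listed here
       so Googlebot revisits them and updates its index; delete
       these entries once the pages drop out of the index. -->
  <url><loc>https://example.com/products?page=2</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>
```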
-
If you want to noindex any of your pages, there is no way Google or any other search engine will think something is fishy. It's up to the webmaster to decide what does and does not get indexed from his website. If you implement a page-level noindex, link juice will still flow to the page; but if you also add nofollow alongside the noindex, the link juice will still flow to the page yet will be contained on the page itself and will not be passed on through the links that point out of that page.
I'll conclude by saying there is nothing wrong with making these pages non-indexable.
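The two variants described above might look like this (a sketch):

```html
<!-- noindex,follow: the page stays out of the index, but its
     outgoing links are still crawled and can pass value -->
<meta name="robots" content="noindex, follow">

<!-- noindex,nofollow: the page stays out of the index AND its
     outgoing links pass nothing on -->
<meta name="robots" content="noindex, nofollow">
```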
Here is an interesting discussion related to this on Moz:
http://moz.com/community/q/noindex-follow-is-a-waste-of-link-juice
Hope it helps.
Best,
Devanur Rafi
-
Devanur,
What I am asking is whether robots/Google will view it as a negative thing to noindex pages while still trying to pass the link juice, even though the pages aren't even viewable to the front-end user.
-
If you wish not to show these pages even to the front-end user, you can just block them using the page-level robots meta tag so that these pages will never be indexed by the search engines either.
Best,
Devanur Rafi
-
Yes, but what if these pages aren't even viewable to the front end user?
-
Hi there. It is a very good idea to block any and all pages that do not provide useful content to visitors, especially when they are very thin content-wise. The idea is to keep low-quality content that does the visitor no good off the Internet, and search engines would love every webmaster for doing so.
However, sometimes, no matter how thin the content on some pages is, they still provide good information to visitors and serve the purpose of the visit. In this case, you can provide contextual links to those pages and add the nofollow attribute to the link. Of course, you should ideally also implement page-level blocking using the robots meta tag on those pages. I do not think you should return a 404 on these pages, as there is no need to do so. When page-level blocking is implemented, Google will not index the blocked content even if it finds a third-party reference to it elsewhere on the Internet.
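The combination described above, a nofollowed contextual link plus page-level blocking on the target, might look like this (a sketch; the URL and anchor text are hypothetical):

```html
<!-- On the linking page: a contextual link that passes no value -->
<a href="https://example.com/size-chart" rel="nofollow">size chart</a>

<!-- On the thin page itself: keep it out of the index -->
<meta name="robots" content="noindex">
```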
If you have implemented the page-level noindex using the robots meta tag, there is no need to include these URLs in a sitemap. With the noindex in place, as I mentioned above, Google will not index the content even if it discovers the page through a reference from anywhere on the Internet.
Hope it helps, my friend.
Best,
Devanur Rafi