Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Noindexing Thin Content Pages: Good or Bad?
-
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and assuming these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404?), or is there any valid reason to keep them?
If you noindex them, should you keep all of the URLs in the sitemap so that Google will recrawl them and notice the noindex tag?
If you noindex them and then remove the sitemap, can Google still recrawl the pages and recognize the noindex tag on its own?
-
Sometimes you need to leave the crawl path open to Googlebot so it can get around the site. A specific example that may be relevant to you is pagination. If you have 100 products and are only showing 10 on the first page, Google will not be able to reach the other 90 product pages as easily if you block paginated pages in robots.txt. Better options in such a case might be a robots noindex,follow meta tag, rel next/prev tags, or a "view all" canonical page.
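As a rough illustration, here is what those three options might look like in the head of a paginated listing page (the URLs are hypothetical placeholders, not taken from the question):

```html
<!-- On page 2 of a hypothetical paginated product listing -->

<!-- Option 1: keep the page out of the index, but let Googlebot follow its links -->
<meta name="robots" content="noindex,follow">

<!-- Option 2: rel next/prev tags describing where this page sits in the series -->
<link rel="prev" href="https://www.example.com/products?page=1">
<link rel="next" href="https://www.example.com/products?page=3">

<!-- Option 3: point the canonical at a "view all" version of the listing -->
<link rel="canonical" href="https://www.example.com/products/view-all">
```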
If these pages aren't important to the crawlability of the site, such as internal search results, you could block them in the robots.txt file with little or no issue, and it would help get them out of the index. If they aren't useful for spiders, users, or anything else, then yes, you can and probably should let them 404 rather than blocking them.
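For example, a minimal robots.txt rule for internal search result pages could look something like this (the /search path and query parameter are assumed purely for illustration):

```
# Hypothetical robots.txt rules blocking internal search result pages
User-agent: *
Disallow: /search
Disallow: /*?q=
```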
Yes, I do like to leave the blocked or removed URLs in the sitemap for just a little while, to ensure Googlebot revisits them and sees the noindex tag, 404 error code, 301 redirect, or whatever it is they need to see in order to update their index. They'll get there on their own eventually, but I find it faster to send them to the pages myself. Once Googlebot visits these URLs and updates its index, you should remove them from your sitemaps.
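To sketch that out, a temporary sitemap entry for one of these URLs might look like the following (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Left in the sitemap only until Googlebot recrawls it and sees the noindex tag -->
  <url>
    <loc>https://www.example.com/products?page=7</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
</urlset>
```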
-
If you want to noindex any of your pages, there is no way that Google or any other search engine will think something fishy is going on. It's up to the webmaster to decide what does and does not get indexed from his website. If you implement a page-level noindex, link juice will still flow to the page; if you also add nofollow along with the noindex, link juice will still flow to the page but will be contained there, and will not be passed on through the links that point out of that page.
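A minimal sketch of the two robots meta tag variants being described, as they would appear in a page's head:

```html
<!-- noindex,follow: keep the page out of the index, but let its outgoing links pass equity -->
<meta name="robots" content="noindex,follow">

<!-- noindex,nofollow: keep the page out of the index and contain equity on the page -->
<meta name="robots" content="noindex,nofollow">
```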
To conclude, there is nothing wrong with making these pages non-indexable.
Here is an interesting discussion related to this on Moz:
http://moz.com/community/q/noindex-follow-is-a-waste-of-link-juice
Hope it helps.
Best,
Devanur Rafi
-
Devanur,
What I am asking is whether Google will view it as a negative thing to noindex pages while still trying to pass link juice through them, even though the pages aren't viewable to the front-end user.
-
If you don't wish to show these pages even to the front-end user, you can simply block them using the page-level robots meta tag, so that these pages will never be indexed by the search engines either.
Best,
Devanur Rafi
-
Yes, but what if these pages aren't even viewable to the front-end user?
-
Hi there, it is a very good idea to block any and all pages that do not provide useful content to visitors, especially when they are very thin content-wise. The idea is to keep low-quality content that does the visitor no good out of the search results, and search engines would love every webmaster for doing so.
However, sometimes, no matter how thin the content on some pages is, they still provide good information to visitors and serve the purpose of the visit. In that case, you can provide contextual links to those pages and add the nofollow attribute to those links. Ideally, you should also implement page-level blocking on those pages using the robots meta tag. I do not think you should return a 404 on these pages, as there is no need to do so. When page-level blocking is implemented, Google will not index the blocked content even if it finds a third-party reference to it from elsewhere on the Internet.
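For illustration, a contextual link carrying the nofollow attribute, pointing at a thin page that itself has a page-level robots meta tag, might look like this (the URL and anchor text are hypothetical):

```html
<!-- Contextual link to a thin-but-useful page, with nofollow on the link itself -->
<a href="https://www.example.com/size-guide" rel="nofollow">See our size guide</a>

<!-- And in the <head> of that thin page, the page-level robots meta tag -->
<meta name="robots" content="noindex">
```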
If you have implemented the page-level noindex using the robots meta tag, there is no need to include these URLs in a sitemap. With noindex in place, as I mentioned above, Google will not index the content even if it discovers the page through a reference from anywhere on the Internet.
Hope it helps my friend.
Best,
Devanur Rafi