Noindexing Thin Content Pages: Good or Bad?
-
If you have a massive number of pages with super thin content (such as pagination pages) and you noindex them, then once they are removed from Google's index (and given that these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404), or is there any valid reason to keep them?
If you noindex them, should you keep all of the URLs in the sitemap so that Google will recrawl them and notice the noindex tag?
If you noindex them and then remove them from the sitemap, can Google still recrawl and recognize the noindex tag on its own?
-
Sometimes you need to leave the crawl path open to Googlebot so it can get around the site. A specific example that may be relevant to you is pagination. If you have 100 products and are only showing 10 on the first page, Google will not be able to reach the other 90 product pages as easily if you block paginated pages in robots.txt. Better options in such a case might be a robots noindex,follow meta tag, rel next/prev tags, or a "view all" canonical page.
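To make those options concrete, the tags might look something like this in the head of a paginated page (a sketch only; the example.com URLs and page parameters are hypothetical):

```html
<!-- Keep the paginated page out of the index, but let Googlebot
     follow its links to the deeper product pages -->
<meta name="robots" content="noindex, follow">

<!-- rel next/prev hints pointing at the adjacent pages in the series -->
<link rel="prev" href="https://example.com/products?page=1">
<link rel="next" href="https://example.com/products?page=3">

<!-- Alternatively, point each paginated page at a "view all" version -->
<link rel="canonical" href="https://example.com/products/view-all">
```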
If these pages aren't important to the crawlability of the site, such as internal search results, you could block them in the robots.txt file with few or no issues, and it would help get them out of the index. If they aren't useful to spiders, users, or anything else, then yes, you can and probably should let them 404 rather than blocking them.
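A robots.txt rule for something like internal search results can be very short (the /search path and query pattern below are placeholders; match them to your own URL structure):

```text
# Block crawling of internal search result pages
User-agent: *
Disallow: /search
Disallow: /*?q=
```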
Yes, I do like to leave the blocked or removed URLs in the sitemap for a little while to ensure Googlebot revisits them and sees the noindex tag, 404 error code, 301 redirect, or whatever it is they need to see in order to update their index. They'll get there on their own eventually, but I find it faster to send them to the pages myself. Once Googlebot visits these URLs and updates its index, you should remove them from your sitemaps.
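Such a temporary sitemap carrying the removed or noindexed URLs is just a standard sitemap file (the URL below is a hypothetical example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Left in place temporarily so Googlebot recrawls the URL and
       sees the noindex tag, 404, or 301 -->
  <url>
    <loc>https://example.com/products?page=2</loc>
  </url>
</urlset>
```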
-
If you want to noindex any of your pages, there is no way that Google or any other search engine will think something is fishy. It's up to the webmaster to decide what does and does not get indexed on their website. If you implement a page-level noindex, link juice will still flow to the page; but if you also add nofollow along with noindex, the link juice will flow to the page, stay contained on the page itself, and not be passed on through the links that flow out of that page.
I'll conclude by saying there is nothing wrong with making these pages non-indexable.
Here is an interesting discussion related to this on Moz:
http://moz.com/community/q/noindex-follow-is-a-waste-of-link-juice
Hope it helps.
Best,
Devanur Rafi
-
Devanur,
What I am asking is whether the robots/Google will view it as a negative thing to noindex pages while still trying to pass link juice through them, even though the pages aren't viewable to the front-end user.
-
If you wish not to show these pages even to the front-end user, you can just block them using the page-level robots meta tag so that these pages will never be indexed by the search engines either.
Best,
Devanur Rafi
-
Yes, but what if these pages aren't even viewable to the front-end user?
-
Hi there, it is a very good idea to block any and all pages that do not provide useful content to visitors, especially when they are very thin content-wise. The idea is to keep low-quality content that does the visitor no good off the Internet. Search engines would love every webmaster for doing so.
However, sometimes, no matter how thin the content on some pages is, they still provide good information to visitors and serve the purpose of the visit. In this case, you can provide contextual links to those pages and add the nofollow attribute to the links. Of course, you should ideally implement the page-level blocking using the robots meta tag on those pages. I do not think you should return a 404 on these pages, as there is no need to do so. When page-level blocking is implemented, Google will not index the blocked content even if it finds a third-party reference to it elsewhere on the Internet.
If you have implemented a page-level noindex using the robots meta tag, there is no need to include these URLs in a sitemap. With noindex in place, as I mentioned above, Google will not index the content even if it discovers the page through a reference from anywhere on the Internet.
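If you want to double-check which of your pages actually carry the page-level noindex, a small script can parse the robots meta tag out of a page's HTML. This is only an illustrative sketch using Python's standard library (fetching the HTML is left out; feed it whatever page source you have):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            content = a.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]


def is_noindexed(html: str) -> bool:
    """True if the page's robots meta tag contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives


blocked = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
open_page = '<html><head><title>Products</title></head></html>'
print(is_noindexed(blocked))    # True
print(is_noindexed(open_page))  # False
```

Run against a crawl of your own URLs, this gives a quick inventory of which thin pages are blocked and which are still indexable.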
Hope it helps, my friend.
Best,
Devanur Rafi