Noindex
-
I have been reading a lot of conflicting information on the Link Juice ramifications of using "NoIndex". Can I get some advice for the following situation?
1. I have pages that I do not want indexed on my site. They are lead conversion pages. Just about every page on my site has links to them. If I just apply a standard link, those pages will get a ton of Link Juice that I'd like to allocate to other pages.
2. If I use "nofollow", the pages won't rank, but the link juice evaporates. I get that, so I won't use "nofollow".
3. I have read that "noindex, follow" will block the pages in the SERPs, but will pass Link Juice to them. I don't think that I want this either. If I "dead end" the lead form with no navigation or links, will the juice be locked up on the page?
4. I assume that I should block the pages in robots.txt
In order to keep the pages out of the SERPs, and conserve Link Juice, what should I do? Can someone please give me a step by step process with the reasoning for what I should do here?
-
I have a private/login site where all pages are noindex, nofollow. Can I still monitor external site links with Google Analytics?
-
Yes, there is a way to keep them out of the SERPs and restrict them from getting link juice: using noindex + nofollow. But bear in mind that you'll be losing that link juice and impairing its flow throughout your site, besides signaling to Google that you don't "trust" those pages.
A workaround would be consolidating those links.
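For reference, the combined directive is a single robots meta tag in each page's &lt;head&gt; (a generic example, not specific to any particular site):

```html
<!-- Keeps the page out of the index AND tells crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```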
-
So what you are saying is that there is no way to keep the pages out of the SERPs and restrict them from getting link juice?
This is nuts. My conversion pages will be getting huge amounts of link juice - there are links to them on every page.
I'm not happy about this. Any workarounds?
-
Using robots.txt won't ensure that your pages are kept out of the SERPs, since any external link to those pages could get them indexed. If you need to make sure, the best way is the noindex meta tag.
Now, in order not to lose your link juice, make sure to use "noindex, follow" in your meta tag; that way you're still preventing the pages from being indexed, but you are allowing the juice to flow through them.
If you want to pass as little juice as possible to those pages, link to them as little as possible, or consolidate those links on fewer pages throughout your site.
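The tag for that recommendation looks like this (a generic example):

```html
<!-- "noindex, follow": keep the page out of the SERPs, but let PageRank
     continue to flow through the links on the page -->
<meta name="robots" content="noindex, follow">
```

Note that "follow" is the default behavior, so `<meta name="robots" content="noindex">` is treated the same way.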
Here's some useful information on the subject:
Google Says: Yes, You Can Still Sculpt PageRank. No You Can't Do It With Nofollow
Link Consolidation: The New PageRank Sculpting
Related Questions
-
How to get a large number of urls out of Google's Index when there are no pages to noindex tag?
Hi, I'm working with a site that has created a large group of URLs (150,000) that have crept into Google's index. If these URLs actually existed as pages, which they don't, I'd just noindex tag them and over time the number would drift down. The thing is, they created them through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. GoogleBot would crawl a link that looks like it points to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. GoogleBot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K URLs seem to share the same URL pattern... exmpledomain.com/item/... so /item/ is common to all of them, if that helps.
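One option sometimes suggested for URLs with no HTML page to tag is the X-Robots-Tag HTTP header, which can be set at the server level for a whole URL pattern. A hedged sketch for Apache with mod_headers enabled (the /item/ path comes from the question; verify the behavior on redirecting URLs before relying on it):

```apache
# Send a noindex directive in the HTTP response headers for every /item/ URL,
# so Google can drop them from the index without an HTML page to tag.
<LocationMatch "^/item/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
```

Since these URLs redirect, `Header always set` may be needed instead of `Header set`, so the header is also attached to non-2xx responses.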
Intermediate & Advanced SEO
-
Sanity Check: NoIndexing a Boatload of URLs
Hi, I'm working with a Shopify site that has about 10x more URLs in Google's index than it really ought to. This equals thousands of urls bloating the index. Shopify makes it super easy to make endless new collections of products, where none of the new collections has any new content... just a new mix of products. Over time, this makes for a ton of duplicate content. My response, aside from making other new/unique content, is to select some choice collections with KW/topic opportunities in organic and add unique content to those pages. At the same time, noindexing the other 90% of excess collections pages. The thing is there's evidently no method that I could find of just uploading a list of urls to Shopify to tag noindex. And, it's too time consuming to do this one url at a time, so I wrote a little script to add a noindex tag (not nofollow) to pages that share various identical title tags, since many of them do. This saves some time, but I have to be careful to not inadvertently noindex a page I want to keep. Here are my questions: Is this what you would do? To me it seems a little crazy that I have to do this by title tag, although faster than one at a time. Would you follow it up with a deindex request (one url at a time) with Google or just let Google figure it out over time? Are there any potential negative side effects from noindexing 90% of what Google is already aware of? Any additional ideas? Thanks! Best... Mike
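A minimal sketch of the kind of script described — matching pages by duplicate title tag and injecting a noindex (not nofollow) meta tag — assuming the page HTML is available as strings and the duplicate titles are already known (the titles and names below are hypothetical):

```python
# Hypothetical sketch: add a noindex meta tag to pages whose <title>
# matches a known list of duplicate titles, leaving other pages untouched.
import re

# Assumed: the set of title tags shared by the excess collection pages.
DUPLICATE_TITLES = {"Cheap Widgets Collection", "Widgets On Sale"}

def needs_noindex(html: str) -> bool:
    """Return True if the page's <title> is one of the known duplicates."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return bool(match) and match.group(1).strip() in DUPLICATE_TITLES

def add_noindex(html: str) -> str:
    """Inject a noindex meta tag before </head> on matching pages."""
    if needs_noindex(html) and 'name="robots"' not in html:
        return html.replace(
            "</head>", '<meta name="robots" content="noindex">\n</head>', 1
        )
    return html

page = "<html><head><title>Cheap Widgets Collection</title></head><body></body></html>"
print('content="noindex"' in add_noindex(page))  # → True
```

Matching on exact title strings is brittle; in practice you'd want to double-check the output list against the pages you intend to keep before deploying, exactly as the question warns.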
-
What's the best way to noindex pages but still keep backlinks equity?
Hello everyone, Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page which I solely want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any possible link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or wreak havoc in some way?
-
Are HTML Sitemaps Still Effective With "Noindex, Follow"?
A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?
-
Index or noindex mobile version?
We have a website called imones.lt and we have a mobile version for it, m.imones.lt. We originally put noindex on m.imones.lt. Is that a good decision or not? We believe that if Google indexes both, it creates duplicate content. We definitely don't want that. But when someone goes through Google to any imones.lt webpage using a smartphone, they are redirected to m.imones.lt/whatever. Thank you for your opinion.
-
Mobile Version showing up on Desktop - NoIndex it?
I had a little issue earlier where I found my client's mobile version of their website showing up in the SERPs on my desktop. I asked my programmer to get rid of it. The programmer put a nofollow tag on the link to the mobile site (from the regular website). He also put a noindex across the whole mobile version of the website. So to double check, I should probably get rid of that noindex on the mobile website, right? I think the nofollow should be enough... thoughts? Thanks!
-
Meta NOINDEX... how long before Google drops dupe pages?
Hi, I have a lot of near-duplicate content caused by URL params, so I have applied a meta noindex tag to those pages. How long will it take for this to take effect? It's been over a week now, and I have done some removal with the GWT removal tool, but still no major drop in indexed pages. Any ideas? Thanks, Ben
-
Panda Updates - robots.txt or noindex?
Hi, I have a site that I believe has been impacted by the recent Panda updates. Assuming that Google has crawled and indexed several thousand pages that are essentially the same, and the site has now passed the threshold to be picked out by the Panda update, what is the best way to proceed? Is it enough to block the pages from being crawled in the future using robots.txt, or would I need to remove the pages from the index using the meta noindex tag? Of course, if I block the URLs with robots.txt then Googlebot won't be able to access the pages in order to see the noindex tag. Anyone have any previous experience of doing something similar? Thanks very much.