Noindex,follow or noindex,nofollow?
-
We have to greatly scale back on one of our services and focus on the other more successful ones. I need to figure out what to do with all the pages relating to the service we are cutting back.
Just to be clear, we aren't getting rid of the service. They still want the pages on the website, but it is better for us to have more link juice going to the other service pages, a larger share of our content centered on the more profitable services, etc.
So, should I noindex/nofollow all the pages relating to the service we are cutting back on? Or should I noindex/follow all the pages relating to the service we are cutting back on?
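For reference, the two options map to these robots meta values in each page's head (a generic sketch; the directive values are the standard ones, the pages are hypothetical):

```html
<!-- noindex,follow: drop the page from the index, but let crawlers follow
     its links, so internal link equity can still flow onward -->
<meta name="robots" content="noindex, follow">

<!-- noindex,nofollow: drop the page from the index AND stop crawlers from
     following its links - equity pointing at this page dead-ends here -->
<meta name="robots" content="noindex, nofollow">
```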
Thanks,
Ruben
-
+1 for EGOL
I would play with the pricing strategy instead of using noindex and nofollow on my site. These unwanted service pages might have valuable Page Authority and pass link juice in internal navigation, so noindex and nofollow can potentially hurt the overall organic search performance of your site.
If you don't want Google to crawl these pages looking for new information, simply block crawling in robots.txt but leave them in Google's index.
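A minimal robots.txt sketch of that suggestion; the /old-service/ path is a placeholder for wherever those pages actually live:

```text
# Stop all crawlers from fetching the scaled-back service pages.
# Note: this blocks crawling only; URLs already indexed can stay in the index.
User-agent: *
Disallow: /old-service/
```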
-
If I have a limited supply of an item, I raise prices so that I make a maximum amount from the stock on hand. I do the same if I am selling a service that is billed by the hour or by the job and I need to limit its availability. I allow the customer to decide if they want what I have at the price I want to receive.
If I have other products that are close to what I am short on, I will remove the short supply product from the category page competition. That will allow people on my site to see comparable products, but anyone who is searching for that product by name might still find my item in search. For that reason, I would allow one or two links to those pages on the site, but not give that item a "noindex".
The above are pricing plays.
For SEO plays, limiting the number of links that enter the pages that are in limited supply will allow PageRank that originally went into them to flow to other pages. This was very effective ten years ago, when PageRank flow was important. Today there are a lot of other items in the algo and on-site connectivity to a page is not as important. However, cutting down the internal links into a page still might be slightly valuable.
-
I would think noindex/nofollow would make the most sense in this case.
Related Questions
-
Internal search pages (and faceted navigation) solutions for 2018! Canonical or meta robots "noindex,follow"?
There seems to be conflicting information on how best to handle internal search results pages. To recap: they are problematic because these pages generally result in lots of query parameters being appended to the URL string for every kind of search, while the title, meta description and general framework of the page remain the same - which is flagged in Moz Pro Site Crawl as duplicate titles, meta descriptions, h1s etc. The general advice these days is NOT to disallow these pages in robots.txt anymore, because there is still value in their being crawled for all the links that appear on the page. But in order to handle the duplicate issues, the advice splits into two camps on what to do: 1. Add a meta robots tag with "noindex,follow" to the page
Intermediate & Advanced SEO | SWEMII
This means the page will not be indexed with all its myriad queries and parameters, and so takes care of any duplicate meta/markup issues - but any other links from the page can still be crawled and indexed = better crawling and indexing of the site. However, you lose any value the page itself might bring.
This is the advice Yoast recommends in 2017: https://yoast.com/blocking-your-sites-search-results/ - who are adamant that Google just doesn't like or want to serve this kind of page anyway... 2. Just add a canonical link tag - this will ensure that the search results page is still indexed as well.
All the different query string URLs, and the array of results they serve - are 'canonicalised' as the same.
However, this seems a bit duplicitous, as the results in the page body could all be very different. Also, all the paginated results pages would be 'canonicalised' to the main search page - which we know Google states is not a correct implementation of the canonical tag:
https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html This picks up on an older discussion here from 2012:
https://moz.com/community/q/internal-search-rel-canonical-vs-noindex-vs-robots-txt
Where the advice was leaning towards using canonicals, because the user was seeing a percentage of inbound traffic into these search result pages - but I wonder if that is still the case? As the older discussion is now six years old, I'm just wondering if there is any new approach, or how others have chosen to handle internal search. I think a lot of the same issues occur with faceted navigation, as discussed here in 2017:
https://moz.com/blog/large-site-seo-basics-faceted-navigation
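The canonical option from camp 2 amounts to a single link element served on every search-results variant (the URLs here are invented for illustration):

```html
<!-- Served on /search?q=widgets, /search?q=widgets&page=3, etc. -->
<!-- Note: pointing paginated pages at page 1 like this is exactly the misuse
     the Google post above warns about, and rel=canonical is only a hint -->
<link rel="canonical" href="https://www.example.com/search">
```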
How to handle potentially thousands (50k+) of 301 redirects following a major site replacement
Intermediate & Advanced SEO | GeezerG
We are looking for the very best way of handling potentially thousands (50k+) of 301 redirects following a major site replacement - and I mean total replacement. Things you should know:
The existing domain has 17 years' history with Google, but rankings have suffered over the past year - and yes, we know why (and the bitch is we paid a good-sized SEO company for that ineffective and destructive work).
The URL structure of the new site is completely different, and SEO-friendly URLs rule. This means that there will be many thousands of historical URLs (mainly dynamic ones) that will attract 404 errors, as they will not exist anymore. Most are product profile pages, and the God Google has indexed them all. There are also many links to them out there.
The new site is fully SEO optimised and is passing all tests so far - however there is a way to go yet. So here are my thoughts on the possible ways of meeting our need:
1: Create 301 redirects for each and every page in the .htaccess file. That would be one huge .htaccess file, 50,000 lines plus - I am worried about the effect on site speed.
2: Create 301 redirects for each and every unused folder, and wildcard the file names. This would be a single rule redirecting every file in each folder to one page, so the 404 issue is overcome but the user doesn't open the precise page they are after.
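Options 1 and 2 would look roughly like this in .htaccess (paths and domain are placeholders; note that mod_alias rules match only the URL path, so legacy dynamic URLs with query strings would need mod_rewrite with a RewriteCond on %{QUERY_STRING} instead):

```apache
# Option 1: one exact rule per old URL - 50,000+ of these, each parsed
# on every request, which is the site-speed worry.
Redirect 301 /old/products/widget-123.html https://www.example.com/widgets/blue-widget/

# Option 2: one wildcard rule per retired folder - kills the 404s, but every
# visitor lands on a generic page instead of the precise replacement.
RedirectMatch 301 ^/old/products/.* https://www.example.com/products/
```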
3: Write some code to create a hard copy 301 index.php file for each and every folder that is to be replaced.
4: Write code to create a hard copy 301 .php file for each and every page that is to be replaced.
5: We could just let the pages all die and list them with Google to advise of their death.
6: We could have the redirects managed by a database rather than .htaccess or individual redirect files. Probably the most challenging thing will be to load the data in the first place, but I assume this could be done programmatically - especially if the new URL can be inferred from the old. Maybe I am missing another, simpler approach - please discuss.
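The "infer the new URL from the old" idea in option 6 can be sketched in a few lines of application code. The legacy and new URL patterns below are invented for illustration; the real rules would come from the product data:

```python
from urllib.parse import urlparse, parse_qs

def redirect_target(old_url):
    """Map a hypothetical legacy dynamic URL like
    /product.php?id=123&name=blue-widget to a hypothetical new
    SEO-friendly form /products/blue-widget-123/.
    Returns None when no rule matches (serve a 404, and log it)."""
    parsed = urlparse(old_url)
    if parsed.path == "/product.php":
        params = parse_qs(parsed.query)
        pid = params.get("id", [None])[0]
        name = params.get("name", [None])[0]
        if pid and name:
            return "/products/{}-{}/".format(name, pid)
    return None  # unknown URL: fall through to a 404
```

The web layer would then answer with a 301 and a Location header whenever the function returns a URL, so only genuinely unmapped legacy URLs ever 404.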
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and re-index each page as resources become available to fill in content. My question is whether an individual page will be able to accrue any page authority for its target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space, up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is: if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
Intermediate & Advanced SEO | THandorf
Sitemap contains Meta NOINDEX pages - Good or bad?
Hi, Our sitemap is created by our e-commerce software, Magento. We are probably going to make a lot of products meta noindex for the moment, until all the content has been corrected on them - but by default, as they are enabled, they will appear in the sitemap. So, the question is: should pages that are meta noindex be listed in a sitemap? Does it matter? Thanks!
Intermediate & Advanced SEO | bjs2010
Same Branding, Same Followers, New Domain After Penalty... Your Opinion Please
I know I've asked a similar question in the past but I'm still trying to figure out what to do with my website. I've got a website at thewebhostinghero.com that's been penalized by both Panda and Penguin. I cleaned up the link profile and submitted a reconsideration request but it was denied. I finally found a handful of additional bad links and I submitted a new disavow + reconsideration request a few days ago and I am still waiting. That said, after submitting the initial disavow request, the traffic has completely gone, and while I expected a drop in traffic, I also expected my penalty to be lifted, but that was not the case. Even though the penalty might be lifted this time, I think that making the website profitable again could be harder than creating a new website. So here's my question: The website's domain is thewebhostinghero.com but I also happen to own webhostinghero.com, which I bought later for $5000 (yes you read that right). The domain "webhostinghero.com" is completely clean as it's only redirecting to thewebhostinghero.com. I would like to use webhostinghero.com as a completely new website and not redirect any traffic from thewebhostinghero.com, so as not to pass any bad link juice. Pros: Keeping the same branding image (which cost me $$$); keeping the 17,000+ Facebook followers; keeping the same Google+ and Twitter accounts; keeping and monetizing a domain that cost me $5000; webhostinghero.com is a better domain than thewebhostinghero.com. Cons: Will create confusion between the 2 websites; any danger of being flagged as duplicate or something? Do you see any other potential issues with this? What's your opinion/advice? P.S. Sorry for my English...
Intermediate & Advanced SEO | sbrault74
Will disallowing in robots.txt noindex a page?
Google has indexed a page I wish to remove. I would like to add a meta noindex, but the CMS isn't allowing me to right now. Would a disallow in robots.txt simply stop them crawling, as I expect, or is it also an instruction to noindex? Thanks
Intermediate & Advanced SEO | Brocberry
Should I NoIndex NoFollow my BUYNOW page?
Hi, As stated in the title, I am wondering if I should noindex,nofollow my shopping cart page - it is actually a buy-now page that receives the item ID in the URL, with only one item per purchase. I received duplication errors, so I have now added a canonical tag, and I wonder if I should simply remove it altogether. Thanks
Intermediate & Advanced SEO | BeytzNet
De-indexing search results noindex, follow or noindex, nofollow
If search results were not originally blocked with robots.txt, and need to be de-indexed, is it better to use noindex, nofollow or noindex, follow?
Intermediate & Advanced SEO | nicole.healthline