To noindex,follow or noindex,nofollow?
-
We have to greatly scale back one of our services and focus on the other, more successful ones. I need to figure out what to do with all the pages relating to the service we are cutting back.
Just to be clear, we aren't getting rid of the service. We still want the pages on the website, but it is better for us to have more link juice going to the other service pages, to have more of our content centered on the more profitable services, etc.
So, should I noindex/nofollow all the pages relating to the service we are cutting back on? Or should I noindex/follow them?
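For reference, the two options correspond to these standard meta robots tags in each page's `<head>`. The "follow" variant keeps the page out of search results while still letting link equity flow through its links; the "nofollow" variant cuts that flow off:

```html
<!-- Option 1: noindex,follow — page is dropped from the index, but
     crawlers still follow its links, so internal link equity keeps
     flowing to the pages it links to -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: noindex,nofollow — page is dropped from the index AND
     crawlers ignore its links, so any link equity it would have
     passed on is lost -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth knowing: Google has said that a page left as noindex,follow for a long time may eventually be treated like noindex,nofollow, since a page that drops out of the index tends to stop being crawled.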
Thanks,
Ruben
-
+1 for EGOL
I would play with the pricing strategy instead of using noindex and nofollow on my site. These unwanted service pages may still hold valuable Page Authority and pass link juice through your internal navigation, so noindex and nofollow can potentially hurt your site's overall organic search performance.
If you don't want Google to crawl these pages looking for new information, simply block crawling in robots.txt; pages blocked this way can still remain in Google's index.
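A minimal robots.txt sketch of that approach, assuming (hypothetically) the scaled-back service's pages all live under a `/legacy-service/` path:

```text
# Block crawling of the scaled-back service section.
# Note: Disallow stops crawling, not indexing — URLs that are already
# indexed can remain in Google's index (often shown without a snippet).
User-agent: *
Disallow: /legacy-service/
```

Keep in mind robots.txt and a meta noindex don't combine well: if a page is blocked in robots.txt, Google can't crawl it to see the noindex tag, so pick one mechanism per page.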
-
If I have a limited supply of an item, I raise prices so that I make a maximum amount from the stock on hand. I do the same if I am selling a service that is billed by the hour or by the job and I need to limit its availability. I allow the customer to decide if they want what I have at the price I want to receive.
If I have other products that are close to what I am short on, I will remove the short-supply product from the category page competition. That will allow people on my site to see comparable products, but anyone who is searching for that product by name might still find my item in search. For that reason, I would allow one or two links to those pages on the site, but not give that item a "noindex".
The above are pricing plays.
For SEO plays, limiting the number of links pointing into the limited-supply pages will allow the PageRank that originally flowed into them to flow to other pages instead. This was very effective ten years ago, when PageRank flow was important. Today there are many other factors in the algorithm, and on-site connectivity to a page is not as important. However, cutting down the internal links into a page still might be slightly valuable.
-
I would think noindex/nofollow would make the most sense in this case.