Noindex search pages?
-
Is it best to noindex search results pages, exclude them using robots.txt, or both?
-
I think you're possibly trying to solve a problem that you don't have!
As long as you've got a good information architecture and are submitting a dynamically updated sitemap, then I don't think you need to worry about this. If you've got a blog, then sharing those posts on Google+ can be a good way to get them quickly indexed.
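Just to illustrate what I mean by a dynamically updated sitemap - it's simply an XML file that your site regenerates whenever content is added or removed. A minimal sketch (the URL and date are just placeholders) might look like:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per page you want crawled; regenerate this file as content changes -->
      <url>
        <loc>https://www.example.com/blog/latest-post/</loc>
        <lastmod>2015-06-01</lastmod>
      </url>
    </urlset>

You can then submit it in Google Webmaster Tools or reference it from your robots.txt with a Sitemap: line.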
-
Our search results are not appearing in Google's index and we are not having any issues with getting our content discovered, so I really don't mind disallowing search pages and noindexing them. I was just wondering what advantage there is to disallowing and what I would lose if I only noindex. Isn't it better to allow many avenues of content discovery for the bots?
-
Don't worry. I'm not saying that in your case it'll be a "spider trap". Where I have seen it cause problems was on a site search results page that included "related searches" links and had a bunch of technical issues.
Are your search results appearing in Google's index?
If you have a valid reason for allowing spiders to crawl this content, then yes, you'll want to just noindex them. Personally, I would challenge why you want to do this - is there a bigger problem with getting search engines to discover new content on your site?
-
Thanks for the response, Doug.
The truth is that it's unlikely that the spiders will find the search results, but if they do, why should I consider it a "spider trap"? Even though I don't want the search results pages indexed, I do want the spiders crawling this content. That's why I'm wondering if it's better to just noindex and not disallow in robots.txt.
-
Using the noindex directive will (should) prevent search engines from including the content in their search results - which is good, but it still means that the search engines are crawling this content. I've seen one (unlikely) instance where trying to crawl search pages created a bit of a spider trap, wasting "crawl budget".
So the simplest approach is usually to use the robots.txt to disallow access to the search pages.
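As a rough sketch - assuming your site search results live under a /search/ path; swap that for whatever your own search URLs actually use - the robots.txt rule would be something like:

    User-agent: *
    # block crawling of internal site-search results (the /search/ path here is an assumption)
    Disallow: /search/

If your search results are driven by a query parameter instead, you'd adjust the Disallow pattern to match that parameter.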
If you've got search results in the index already, then you'll want to think about continuing to let Google crawl the pages for a while and using the noindex to help get them de-indexed.
Once this has been done, then you can disallow the site search results in your robots.txt.
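While that's happening, the search results template would carry a robots meta tag along these lines (the pages stay crawlable, i.e. not yet disallowed in robots.txt):

    <!-- in the <head> of each site-search results page -->
    <meta name="robots" content="noindex, follow">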
Another thing to consider is how the search spiders are finding your search results in the first place...
-
I think it's better to use robots.txt. With that, you won't have a problem if someone links to your page.
For extra safety you can also add a robots meta tag for this.
But, as always, it's up to the spider whether it respects robots.txt, links or meta tags. If your page is meant to be private, make it genuinely private and put it behind a login/validation system. If you don't, some "bad" spiders can still read and cache your content.
-
Noindex and blocking in robots.txt pretty much do the same thing, but you should only do this if you don't want the pages indexed; for more secure areas of the site I would block robots too.
If it's to avoid duplicate content, don't forget you can use the rel=canonical tag.
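For example, each duplicate or parameterised variant can point back to the version you want indexed with something like this in its <head> (the URL is just a placeholder):

    <!-- example.com/widgets/ stands in for the page you actually want indexed -->
    <link rel="canonical" href="https://www.example.com/widgets/">

The canonical is treated as a strong hint rather than a directive, but it usually consolidates duplicates without blocking crawling.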
Related Questions
-
Should I use NoIndex on short-lived pages?
Hello, I have a large number of product pages on my site that are relatively short-lived: probably in the region of a million+ pages that are created and then removed within a 24 hour period. Previously these pages were being indexed by Google and did receive landings, but in recent times I've been applying a NoIndex tag to them. I've been doing that as a way of managing our crawl budget but also because the 410 pages that we serve when one of these product pages is gone are quite weak and deliver a relatively poor user experience. We're working to address the quality of those 410 pages but my question is should I be no-indexing these product pages in the first place? Any thoughts or comments would be welcome. Thanks.
Intermediate & Advanced SEO | PhilipHGray
-
Magento 1.9 SEO. I have product pages with identical on-page SEO scores in the 90s. Some pull up on Google page 1, some won't pull up at all. I am searching for the exact title on that page.
I have a website built on Magento 1.9. There are approximately 290,000 part numbers on the site. I am sampling Google SERP results. About 20% of the keywords show up on page 1, positions 5 through 10; 80% don't show up at all. When I do a Moz page score I get high 80s to 90s. A page score of 89 on one part # may show up on page one; an identical page score on a different part # can't be found on Google. I am searching for the exact part # in the page title. Any thoughts on what may be going on? This seems to me like a Magento SEO issue.
Intermediate & Advanced SEO | CTOPDS
-
Page Authority
Hi, We have a large number of pages, all sitting within various categories. I am struggling to rank a level 3 page, for example, or increase the authority of such a page. Apart from putting it in the main menu or trying to build quality links to it, are there any other methods I can try? We have so many pages that I find it hard to work out the best way to internally link these pages for authority. At the moment they're classified in their relevant categories, but these go from level 1 down to level 4 - is this too many classification levels?
Intermediate & Advanced SEO | BeckyKey
-
Splitting down pages
Hello everyone, I have a page on my directory, for example:
https://ose.directory/topics/breathing-apparatus
The title on this page is small yet a bit unspecific: Breathing Apparatus Companies, Suppliers and Manufacturers. On Webmaster Tools these terms hold different values for each category, so "topic name companies" sometimes has a lot more searches than "topic name suppliers". I was thinking, if I could split the page into the following three separate pages, would that be better:
https://ose.directory/topics/breathing-apparatus (main - Title: Breathing Apparatus)
https://ose.directory/topics/breathing-apparatus/companies (Title: Breathing Apparatus Companies)
https://ose.directory/topics/breathing-apparatus/manufacturers (Title: Breathing Apparatus Manufacturers)
https://ose.directory/topics/breathing-apparatus/suppliers (Title: Breathing Apparatus Suppliers)
Two questions: Would this be more beneficial from an SEO perspective? Would Google penalise me for doing this, and if so, is there a way to do it properly? PS: The list of companies may be the same but the page content ever so slightly different. I know this would not affect my users much because the terms I am using all mean pretty much the same thing. The companies do all three.
Intermediate & Advanced SEO | SamBayPublishing
-
Does alt tag optimization benefit search rankings (not image search) at all?
The benefits of alt tag optimization for traditional SEO has always been a "yo yo" subject for me. Way back in the day (2004 to 2007) I believed there was some benefit to alt tag SEO. However as time went on I saw evidence that the major search engines were no longer considering alt tag SEO as a ranking signal. However I later had the pleasure to work on a joint project with a high end SEO firm in 2011/2012. My colleagues fully believed that alt tag optimization was still a very important strategy for traditional SEO at that time. Is there any evidence available that alt tags still help with traditional SEO nowadays? I'm fully aware of the benefits of optimized alt tags and image search. However could optimized alt tags be one of those ranking factors that Google removed due to abuse and later quietly resurrected?
Intermediate & Advanced SEO | RosemaryB
-
Should I block temporary pages
I need some SEO advice on an odd scenario: We are launching a new product line (party supplies) on its own domain (PartySuperCenter.com). Due to some internal/technical reasons we will not be able to launch the site until the summer. We already have the product in our warehouse, so the owners want to create a section on our current site (CostumeSuperCenter.com) for the new products. Once the new site is up, the product will be removed from our current site and moved to the new site. I am concerned about the effect this will have on our SEO - having thousands of product pages appear and then disappear after a few months. I was thinking about blocking the pages using the "noindex" tag. Is this how you would handle it? Thanks in advance for your help!
Intermediate & Advanced SEO | costume
-
Bridge page problem
Hello, I run this website: http://www.bestillkredittkort.no (a Norwegian website). I'm working all I can to make it rank, so I wanted to test AdWords to see how well the page converted. The problem is that my page got labeled a bridge page. I have read the Google guideline for fixing bridge pages and tried to fix what they suggest to make them accept the page, but it's not working. I think there might be an underlying problem here, but I'm not sure how to fix it. I've even seen people from the same niche running AdWords campaigns with less content on the target page. Last time I tried to recheck if my site would get approved, I looked over the quality score tab, and while it was pending it showed 10/10 in every aspect. Was that just a sample? I'm really confused on this one, and I'm afraid there might be a problem with the page that could destroy my SEO efforts. Anyone have any suggestions or feedback on this one?
Intermediate & Advanced SEO | katal
-
Multiple URLs for the same page
I am working with a client and recently discovered that they have several URLs that go to the same page:
http://www.maps.com/FunFacts.aspx
http://www.maps.com/funfacts.aspx
http://www.maps.com/FunFacts.aspx?nav=FF
http://www.maps.com/FunFacts.aspx?nav=FS
http://www.maps.com/funfacts.aspx?nav=FF
http://www.maps.com/funfacts.aspx?nav=ff
http://www.maps.com/FunFacts.aspx?nav=MS
http://www.maps.com/funfacts.aspx?nav=
http://www.maps.com/FunFacts.aspx?nav=FF#
http://www.maps.com/FunFacts
http://www.maps.com/funfacts.aspx?.nav=FF
I am afraid this is happening all over the site. So, my question is: is this hurting the SEO, and how? If so, what is the best way to go about fixing this problem? Thanks for your help!
Intermediate & Advanced SEO | WebMarketingandDesign