Noindex search pages?
-
Is it best to noindex search results pages, exclude them using robots.txt, or both?
-
I think you're possibly trying to solve a problem that you don't have!
As long as you've got a good information architecture and are submitting a dynamically updated sitemap, I don't think you need to worry about this. If you've got a blog, then sharing those posts on Google+ can be a good way to get them quickly indexed.
-
Our search results are not appearing in Google's index and we are not having any issues with getting our content discovered, so I really don't mind disallowing search pages and noindexing them. I was just wondering what advantage there is to disallowing and what I would lose if I only noindex. Isn't it better to allow many avenues of content discovery for the bots?
-
Don't worry. I'm not saying that in your case it'll be a "spider trap". Where I have seen it cause problems was on a site-search results page that included a "related searches" block on top of a bunch of other technical issues.
Are your search results appearing in Google's index?
If you have a valid reason for allowing spiders to crawl this content, then yes, you'll want to just noindex those pages. Personally, I would challenge why you want to do this - is there a bigger problem with getting search engines to discover new content on your site?
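If you do go the noindex-only route, the tag is a one-liner on each search results page. A minimal sketch - the exact pages it goes on depend on your own URL structure:

```html
<!-- In the <head> of each site-search results page:
     keep the page out of the index, but still let spiders follow its links -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an `X-Robots-Tag: noindex` HTTP header if editing the page templates is awkward.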
-
Thanks for the response, Doug.
The truth is that it's unlikely the spiders will find the search results, but if they do, why should I consider it a "spider trap"? Even though I don't want the search results pages indexed, I do want the spiders crawling this content. That's why I'm wondering if it's better to just noindex and not disallow in robots.txt.
-
Using the noindex directive will (should) prevent search engines from including the content in their search results - which is good, but it still means that the search engines are crawling this content. I've seen one (unlikely) instance where trying to crawl search pages created a bit of a spider trap, wasting "crawl budget".
So the simplest approach is usually to use the robots.txt to disallow access to the search pages.
If you've got search results in the index already, then you'll want to think about continuing to let Google crawl the pages for a while and using the noindex to help get them de-indexed.
Once this has been done, then you can disallow the site search results in your robots.txt.
Another thing to consider is how the search spiders are finding your search results in the first place...
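To make the order of operations above concrete, the eventual robots.txt rule is simple. A sketch, assuming your site search lives under a `/search/` path (substitute your own URL pattern):

```txt
# robots.txt - block crawling of site-search results
User-agent: *
Disallow: /search/
```

One caveat worth spelling out: add this Disallow only *after* any already-indexed search pages have dropped out of the index, because once a page is blocked in robots.txt the spiders can no longer fetch it to see the noindex tag.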
-
I think it's better to use robots.txt. With that, you won't have a problem if someone links to your page.
For extra safety you can also add a meta robots tag for this.
But, as always, it's up to the spider whether to respect robots.txt, links, or meta tags. If your page is private, make it truly private and put it behind a login/validation system. If you don't, some "bad" spiders can still read and cache your content.
-
Noindex and blocking via robots.txt achieve much the same result, but you shouldn't rely on either alone if you really don't want pages indexed; for more secure areas of the site I would block robots too.
If it's to avoid duplicate content, don't forget you can use the rel=canonical tag.
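For reference, the canonical tag goes in the `<head>` of the duplicate (e.g. search or filtered) page and points at the preferred URL. A minimal sketch - the domain and path here are just placeholders:

```html
<!-- On the duplicate page: tell search engines which URL should be indexed instead -->
<link rel="canonical" href="https://www.example.com/category/widgets/">
```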