Skip indexing the search pages
-
Hi,
I want all of my search pages excluded from Google's index, so I have this line in robots.txt: Disallow: /search/
Now any posts whose URLs start with "search" are being blocked, and in Google I see this message:
"A description for this result is not available because of this site's robots.txt – learn more."
How can I handle this, and how can I find all the URLs that Google is blocking from showing?
Thanks
-
Sure - you have URLs that are being blocked by robots.txt, because your file contains this line:
Disallow: /questions/search
Robots.txt rules match by prefix, so this prevents any URL within the questions folder that starts with the word "search" from being crawled. What are you trying to accomplish with this block? If you mean the search folder within questions, it should be /questions/search/ (with the trailing slash).
The other warning is telling you that these pages take a long time to load - check your server or those individual pages and see why that is.
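To illustrate the prefix matching described above, here's a quick sketch using Python's standard-library robots.txt parser. The URLs are hypothetical, and note that this parser covers only basic prefix matching (Google additionally supports wildcards):

```python
# Demonstrate how a robots.txt Disallow rule matches by URL-path prefix.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /questions/search",  # no trailing slash: plain prefix match
])

# Blocked: any path beginning with /questions/search
print(rp.can_fetch("*", "https://example.com/questions/search-the-internet"))  # False
print(rp.can_fetch("*", "https://example.com/questions/search/"))              # False

# Not blocked: "search" appearing in a path that doesn't start with the prefix
print(rp.can_fetch("*", "https://example.com/search-tips"))                    # True
```

With the trailing slash (`Disallow: /questions/search/`), only the folder itself would be blocked, and `/questions/search-the-internet` would remain crawlable.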
-
As Saijo said above, the meta robots noindex tag is the way to go. When you block a folder via robots.txt, you prevent Google from visiting and crawling that folder and any content within it. But if Google has already crawled the content, blocking it with robots.txt alone won't remove it from the index. The old version of the page stays stored in the index; Google just can't show an updated snippet, because the robots.txt block prevents recrawling.
To remove the pages from the index completely, you can do one of two things -
- in Webmaster Tools, go to the URL removal section and remove that folder from the index - this only works while the folder is blocked via robots.txt
- add a meta robots noindex tag to the pages/page template and remove the robots.txt block - you need to remove the block so the search engines can recrawl the pages, see the meta robots directive, and follow the noindex instruction to drop the pages
In general, I would recommend the meta robots noindex directive over robots.txt, because it works for all search engines, you won't have to go into each engine's webmaster tools separately, and you won't accidentally block other URLs.
From your example above: if you only blocked the folder /search/, a page that merely includes the word "search" in the URL but isn't in that folder shouldn't be blocked by that line. I would check the robots.txt section in Webmaster Tools, because based on the robots.txt file you describe, no URL like that should be blocked.
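One practical way to confirm the noindex directive is actually in place is to parse the page template and check for the tag. A minimal sketch using Python's standard-library HTML parser - the markup below is a hypothetical search-page template, not the poster's actual site:

```python
# Check that a page template carries a meta robots "noindex" directive,
# using only Python's standard library.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

# Hypothetical search-results template
page = """<html><head>
<meta name="robots" content="noindex, follow">
<title>Search results</title>
</head><body>...</body></html>"""

finder = RobotsMetaFinder()
finder.feed(page)
print("noindex" in finder.directives)  # True
```

`noindex, follow` is a common choice for search pages: it removes the page from the index while still letting crawlers follow its links.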
Good luck,
Mark
-
I guess I was not clear with my question.
I have this in robots.txt: Disallow: /search/
My intention in placing /search/ there was to stop Google from indexing any of my search pages.
What's happened now is that posts like the one below are also being blocked:
www.somesite.com/questions/search-the-internet
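If a URL like that is really being blocked, the live robots.txt almost certainly contains a broader rule than Disallow: /search/, since that rule only matches paths that begin with /search/. A quick way to test, using Python's standard-library parser and the hypothetical URLs above:

```python
# Verify that "Disallow: /search/" matches from the start of the path,
# so it cannot be what's blocking /questions/search-the-internet.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /search/"])

print(rp.can_fetch("*", "http://www.somesite.com/search/kittens"))                  # False
print(rp.can_fetch("*", "http://www.somesite.com/questions/search-the-internet"))   # True
```

If the second check comes back blocked in Google's own robots.txt tester, the deployed file differs from the one you think is live.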
-
To block search pages from the index, you can try adding the meta robots NOINDEX tag in the head section of the search pages.
-