How to remove my site's pages from search results?
-
I have tested hundreds of pages to see if Google will properly crawl, index, and cache them. Now I want these pages removed from Google search, except for the homepage. What should the rule in robots.txt be?
I use this rule, but I am not sure whether Google will remove the hundreds of pages from my testing:
User-agent: *
Disallow: /
Allow: /$
-
Why not just 404/410 those pages?
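For example, if the test pages all sat under a common path (using /test/ here purely as a hypothetical, so adjust it to your URL structure), the 410 could be returned with a one-line rule on Apache via mod_alias:
# .htaccess - return 410 Gone for anything under /test/ (hypothetical path)
RedirectMatch 410 ^/test/
Once Googlebot recrawls those URLs and sees the 410, it drops them from the index.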
-
Hi Matt! I've already tried your suggestion. I'll let you know how it turns out. Thanks a lot, man!
-
Why don't you try adding a meta robots tag with "NOINDEX" on those pages?
I would also do a URL removal with WMT.
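For reference, the meta tag would go in the <head> of each test page and look something like this:
<meta name="robots" content="noindex">
Keep in mind that Google has to be able to crawl a page to see the tag, so don't block the same URLs in robots.txt at the same time.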
-
These are just test pages, and I need them to be private and not visible in Google once testing is done. I understand that there will be a drop in SERP rankings.
-
I would do:
User-agent: *
Disallow: /?
Allow: /
But test it in WMT first to be safe. However, you must be sure that this is the route you want to go down. Robots.txt will prevent those pages from being crawled, which means that none of their content will count. Any links to these pages may also be devalued. The result is a potential drop in SERPs.
Why don't you want them appearing? That way we may be able to find an alternative solution.
-
This is basically a duplicate of your other thread where I gave you that code. Yes, it should block the other pages. Put it in, fetch in WMT, and you should be right.
You can also test it in WMT before you implement it. I tried it on my end and it works.
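For reference, here is what each line of that rule does (the trailing $ is a wildcard extension that Googlebot supports, so other crawlers may ignore it):
User-agent: *
# Block crawling of every URL on the site...
Disallow: /
# ...except the bare homepage. The $ anchors the match to the end of the
# URL, and the longer, more specific Allow rule wins for "/" itself.
Allow: /$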
Related Questions
-
Competing Pages on Ecommerce Site - Very Frustrating
We have multiple issues with this situation. We rank #1 for "Lace Fabric", #3 for "Lace Trim", and #80 for "Lace". We also rank for "Lace Ribbon" and "Lace Appliques". The Lace Fabric and Lace Trim pages have plenty of backlinks, which may be where the problem lies. We have a similar issue for "Satin": "Silk Satin", "Polyester Satin", "Satin Trim", "Satin Ribbon", etc. This is a very annoying and common pattern. Our backlink profile is sterling, yet competitors with inferior backlink profiles and branded search are outranking us, even though we outrank them across the board for two-word terms. Based on my evaluation of TF/CF, PA/DA, content, etc., we should be on page 1 for "Lace". IMHO, these pages are competing with each other for the head term. Any ideas on how to eliminate this issue and rank for head terms?
Intermediate & Advanced SEO | GWMSEO
-
On-site Search - Revisited (again, *zZz*)
Howdy Moz fans! Okay, so there's a mountain of information out there on the webernet about internal search results... but I'm finding some contradiction and a lot of pre-2014 stuff. I'd like to hear some 2016 opinion, specifically around a couple of thoughts of my own as well as some I've deduced from other sources. For clarity, I work on a large retail site with over 4 million products (product pages), and my predicament is this: I want Google to be able to find and rank my product pages. Yes, I can link to a number of the best ones by creating well-planned links via categorisation, silos, efficient menus etc. (done), but can I utilise site search for this purpose? It was my understanding that Google's bots don't/can't/won't use a search function... how could they? It's like expecting them to find your members-only area: they can't log in! How can they find and index the millions of combinations of search results without typing in "XXXXL underpants" and every other search combination? Do I really need to robots.txt my search query parameter? How/why/when would Googlebot generate that query parameter? Site search is B.A.D. - I read this everywhere I go, but is it really? I've read "it eats up all your search quota", "search results have no content and are classed as spam", and "results pages have no value". I want to find a positive SEO outcome to having a search function on my website, not just try to stifle Mr. Googlebot. What I am trying to learn here is what the options are and what their outcomes are. So far I have:
Robots.txt - Remove the search pages from Google.
No Index - Allow the crawl but don't index the search pages.
No Follow - I'm not sure this is even a valid idea, but I picked it up somewhere out there.
Just leave it alone - Some of your search results might get ranked and bring traffic in.
It appears that each and every option has its positive and negative connotations. It'd be great to hear from the community about their experiences in this practice.
Intermediate & Advanced SEO | Mark_Elton
-
Removing pages from index
My client is running 4 websites on the ModX CMS, using the same database for all the sites. Roger has discovered that one of the sites has 2,050 302 redirects pointing to the client's other sites. The sitemap for the site in question includes 860 pages. Google Webmaster Tools has indexed 540 pages. Roger has discovered 5,200 pages, and a site: query on Google reveals 7,200 pages. Diving into the SERP results, many of the indexed pages point to the other 3 sites. I believe there is a configuration problem with the site, because the other sites do not have a huge volume of redirects when crawled. My concern is: how can we remove from Google's index the 2,050 pages that are redirecting to the other sites via a 302 redirect?
Intermediate & Advanced SEO | tinbum
-
My website (non-adult) is not appearing in Google search results when I have SafeSearch settings on. How can I fix this?
Hi, I have this issue where my website does not appear in Google search results when I have the SafeSearch setting on. If I turn the SafeSearch setting off, my site appears no problem. I'm guessing Google is categorizing my website as adult, which it definitely is not. Has anyone had this issue before? Or does anyone know how to resolve it? Any help would be much appreciated. Thanks
Intermediate & Advanced SEO | CupidTeam
-
My homepage doesn't rank anymore. It's been replaced by irrelevant subpages which rank around 100-200 instead of top 5.
Hey guys, I think I got some kind of penalty on my homepage. I was in the top 5 for my keywords. Then a few days ago, my homepage stopped ranking for anything except searches for my domain name in Google. sitename.com/widget-reviews/ previously ranked #3 for "widget reviews", but now sitename.com/widget-training-for-pet-cats/ is ranking #84 for "widget reviews" instead. Similarly, across all my other keywords, the wrong, irrelevant pages are ranking. Did I get some kind of penalty?
Intermediate & Advanced SEO | wearetribe
-
Most Painless way of getting Duff Pages out of SE's Index
Hi, I've had a few issues that have been caused by our developers on our website. Basically, we have a pretty complex method of automatically generating URLs and web pages, and they have stuffed up the URLs at some point and managed to get tens of thousands of duff URLs and pages indexed by the search engines. I've now got to get these pages out of the search engines' indexes as painlessly as possible, as I think they are causing a Panda penalty. All these URLs have an additional directory level in them called "home" which should not be there, so I have www.mysite.com/home/page123 instead of the correct URL www.mysite.com/page123. All of these are totally duff URLs with no links going to them, so I'm gaining nothing by 301 redirects, and I was wondering if there was a more painless, less risky way of getting them all out of the indexes (i.e. after the stuff-up by our developers in the first place, I'm wary of letting them loose on 301 redirects in case they cause another issue!). Thanks
Intermediate & Advanced SEO | James77
-
301 a page and then remove the 301
I have a real estate website that has a city hub page. All the homes for sale within a city are linked to from this hub page. Certain small cities may have one home on the market for a month and then not have any homes on the market for months or years; I call them "Ghost Cities". This problem happens across many cities at any point in time, and the resulting city hub pages are left with little to no content. We are throwing around the idea of 301 redirecting these "Ghost City" pages to a page higher up in the hierarchy (think state or county) until we get new homes for sale in the city. At that point we would remove the 301. Any thoughts on this strategy? Is it bad to turn 301s on and off like that? Thanks!
Intermediate & Advanced SEO | ChrisKolmar
-
Should the sitemap include just menu pages or all pages site-wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. We're redoing a few things and will need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and that's it, but post-Panda, should I do anything different?
Intermediate & Advanced SEO | EricPacifico