What is the best way to stop a page being indexed?
-
What is the best way to stop a page from being indexed? Is it better to do it at the site level, with a robots.txt file in the root directory, or at the page level, with the meta robots tag?
-
Thanks that's good to know!
-
To prevent all robots from indexing a page on your site, place the following meta tag into the <head> section of your page: <meta name="robots" content="noindex">
To allow other robots to index the page while blocking only a specific search engine bot, for example Google's, name that bot instead: <meta name="googlebot" content="noindex">
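Putting the two variants together, the tags sit inside the page's <head> (a minimal sketch; you would use one or the other, not both):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Block all robots from indexing this page -->
  <meta name="robots" content="noindex">
  <!-- Or block only Googlebot, leaving other crawlers free to index -->
  <meta name="googlebot" content="noindex">
</head>
<body>...</body>
</html>
```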
When Google sees the noindex meta tag on a page, it will completely drop the page from its search results, even if other pages link to it. Other search engines, however, may interpret this directive differently, so a link to the page can still appear in their search results.
Note that because Google has to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect it. If your page is still appearing in results, it's probably because Google hasn't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, Google won't be able to see the tag either.)
If the content is currently in Google's index, it will be removed the next time Google crawls the page. To expedite removal, use the Remove URLs tool in Google Webmaster Tools.
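If you want to verify what directive a page actually serves, you can check its HTML with a short script. This is a minimal sketch using only Python's standard library; the sample page string is illustrative, and in practice you would feed it the fetched source of your live page:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            # content is a comma-separated list, e.g. "noindex, follow"
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(","))


def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    # noindex takes precedence over index if both appear
    return "noindex" in parser.directives


page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```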
-
Thanks that's good to know.
-
"noindex" takes precedents over "index" so basicly if it says "noindex" anywhere google will follow that.
-
Thanks for the answers, guys... Can I ask: in the event that a robots.txt block is implemented at the domain level, but the markup on the page is <meta name="robots" content="index, follow">, which one wins?
-
Why not both? In some cases one method is preferred over the other, or in fact necessary. With non-HTML documents such as PDFs, there is no page to put a meta tag in, so you may have to use robots.txt or the X-Robots-Tag HTTP header to keep them from being indexed. I'll also give you another option: password-protect the directory.
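For the PDF case, the X-Robots-Tag response header does what the meta tag can't. A sketch for Apache, assuming mod_headers is enabled (the file pattern is illustrative):

```apacheconf
# Send a noindex header with every PDF the server delivers
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```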
-
Hi,
While the page-level robots meta tag is the best way to stop a page from being indexed, a domain-level robots.txt can save the search engines some crawl bandwidth. With robots.txt blocking in place, Google will not crawl the page from within your website, but it can still pick up the URL if it is mentioned somewhere else, such as on a third-party website. In cases like these, the page-level robots meta tag comes to the rescue. So it would be best if the pages are blocked using the robots.txt file as well as the page-level meta robots tag. Hope that helps.
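A domain-level robots.txt block of the kind described here looks like this (the path is hypothetical; the file must live at the site root):

```text
# robots.txt - stop all bots from crawling one directory
User-agent: *
Disallow: /private-section/
```

Note that Disallow stops crawling, not indexing: a URL blocked this way can still appear in results if it is linked from elsewhere.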
Good luck friend.
Best regards,
Devanur Rafi