What is the best way to stop a page being indexed?
-
What is the best way to stop a page being indexed? Is it better to do this at the site level, with a robots.txt file in the root directory, or at the page level, with the robots meta tag?
-
To prevent all robots from indexing a page on your site, place the following meta tag into the <head> section of your page: <meta name="robots" content="noindex">
To allow other robots to index the page while blocking only a specific search engine's bot, name that bot in the tag instead. For example, to keep only Google's robots from indexing the page: <meta name="googlebot" content="noindex">
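For context, here is a minimal, illustrative page skeleton showing where the tag sits (the title and body are just placeholders):

<!DOCTYPE html>
<html>
  <head>
    <title>Example page</title>
    <!-- keep this page out of search results -->
    <meta name="robots" content="noindex">
  </head>
  <body>
    ...
  </body>
</html>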
When Google sees the noindex meta tag on a page, it will completely drop the page from its search results, even if other pages link to it. Other search engines, however, may interpret this directive differently, so a link to the page can still appear in their search results.
Note that because Google has to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect it. If your page is still appearing in results, it's probably because Google hasn't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, Google won't be able to see the tag at all.)
If the content is currently in Google's index, it will be removed the next time the page is crawled. To expedite removal, use the Remove URLs tool in Google Webmaster Tools.
-
Thanks, that's good to know.
-
"noindex" takes precedents over "index" so basicly if it says "noindex" anywhere google will follow that.
-
Thanks for the answers guys... Can I ask: in the event that the robots.txt file is implemented at the domain level but the markup on the page is <meta name="robots" content="index, follow">, which one wins?
-
Why not both? In some cases one method is preferred over the other, or is in fact necessary. With non-HTML documents such as PDFs, you may have to use robots.txt or an X-Robots-Tag HTTP header to keep them from being indexed. I'll also give you another option, and that is to password-protect the directory.
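To sketch those two options on an Apache server (assuming mod_headers is enabled and an .htpasswd file already exists at /etc/apache2/.htpasswd; the paths and file types are just examples):

# Send a noindex header for PDFs, which have no <head> to put a meta tag in
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

# Or keep crawlers (and everyone else) out by password-protecting a directory (.htaccess)
AuthType Basic
AuthName "Restricted area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user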
-
Hi,
While the page-level robots meta tag is the best way to stop a page from being indexed, domain-level robots.txt blocking can save search engines some crawl bandwidth. With robots.txt blocking in place, Google will not crawl the page from within the website, but it can still pick up the URL if it is mentioned somewhere else, such as on a third-party website. In cases like these, the page-level robots meta tag comes to the rescue. So it would be best if the pages are blocked using the robots.txt file as well as the page-level meta robots tag. Hope that helps.
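For reference, blocking a page with both mechanisms, as suggested above, might look something like this (the path is only an example):

# robots.txt in the site root
User-agent: *
Disallow: /private-page.html

together with <meta name="robots" content="noindex"> in that page's <head>.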
Good luck friend.
Best regards,
Devanur Rafi
Related Questions
-
Escort directory page indexing issues
Re; escortdirectory-uk.com, escortdirectory-usa.com, escortdirectory-oz.com.au,
Technical SEO | ZuricoDrexia
Hi, We are an escort directory with 10 years history. We have multiple locations within the following countries, UK, USA, AUS. Although many of our locations (towns and cities) index on page one of Google, just as many do not. Can anyone give us a clue as to why this may be?
-
Get List Of All Indexed Google Pages
I know how to run site:domain.com but I am looking for software that will put these results into a list and return server status (200, 404, etc). Anyone have any tips?
Technical SEO | InfinityTechnologySolutions
-
What is the best way to redirect visitors to certain pages of your site based on their location?
One website I manage wants to redirect users to state-specific pages based on their location. What is the best way to accomplish this? For example, a user enters through site.com but they are in Colorado, so we want to direct them to site.com/colorado.
Technical SEO | Firestarter-SEO
-
How to stop google from indexing specific sections of a page?
I'm currently trying to find a way to stop Googlebot from indexing specific areas of a page. Long ago, Yahoo Search created the tag class="robots-nocontent", and I'm trying to see if there is a similar mechanism for Google, or if they have adopted the same tag? Any help would be much appreciated.
Technical SEO | Iamfaramon
-
Home page not indexed by any search engines
We are currently having an issue with our homepage not being indexed by any search engines. We recently transferred our domain to GoDaddy and there was an issue with the DNS. When we typed our url into Google like this "https://www.mysite.com" nothing from the site came up in the search results, only our social media profiles. When we typed our url into Google like this "mysite.com" we were sent to a GoDaddy parked page. We've been able to fix the issue over at GoDaddy and the url "mysite.com" is now being redirected to "https://mysite.com", but Google and the other search engines have yet to respond. I would say our fix has been in place for at least 72 hours. Do I need to give this more time? I would think that at least one search engine would have picked up on the change by now and would start indexing the site properly.
Technical SEO | bcglf
-
Is there any value to a home page URL adding the /index.html ?
For proper SEO, which version would you prefer?
A. www.abccompany.com
B. www.abccompany.com/index.html
Is there any value or difference with either home page URL?
Technical SEO | theideapeople
-
What is the best way to change your sites folder structure?
Hi, our site was originally created with a very flat folder structure; most of the pages are at the top level. Because we will be adding more content, I want to tidy up the structure first. I just wanted to check what the best way to go about this was. Is it best to:
1. First configure all the new 301 redirects to point to the new pages, while leaving the actual links on our site pointing to the old pages, then change the links on the site after a few weeks; or
2. Configure the redirects and change the actual links on my website at the same time to point to the new locations.
My thinking is that if I go with option 1, I will give Google a chance to process all the redirects and change the locations in its index before I start pointing the links to the new locations. But does it make any difference? What is the best way to go about making this sort of change to minimize any loss in rankings, PageRank, etc.? Thanks for the help.
Technical SEO | Maximise
-
Importance of an optimized home page (index)
I'm helping a client redesign their website and they want to have a home page that's primarily graphics and/or Flash (or jQuery). If they are able to optimize all of their key sub-pages, what is the harm in terms of SEO?
Technical SEO | EricVallee34