Correct use of robots.txt
-
I'm in the process of building a website and am experimenting with some new pages. I don't want search engines to begin crawling the site yet. I would like to add robots.txt rules for the pages I don't want them to crawl. If I do this, can I remove the rules later and get the search engines to crawl those pages?
-
Lewis,
Thank you for the clarification!
-
Hi Eric,
The guidance above means that when Google looks to crawl your site, it won't; it's not a message to Google telling it never to come back.
Once everything is sorted, remove whichever approach you took to block the search engines and submit a sitemap to Google via Webmaster Tools. Your site should be crawled in no time after that.
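You can also reference your sitemap from within robots.txt itself, which most major crawlers will pick up (the URL below is just a placeholder):
Sitemap: https://www.example.com/sitemap.xml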
Hope this helps.
-
Damian,
Thanks for your answer, that helps. If I add either one of the above items to my web pages and then remove it at a later date, will the search engines crawl and rank my site (at some time after they are removed)? In other words, and I know this sounds stupid, but does a search engine see a robots.txt file and never visit the site again?
-
Hey Eric,
If you want to create and work on pages but you don't want them indexed, you can add the following to the page in the <head> section (the pages will still be crawled):
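<meta name="robots" content="noindex">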
If you want NONE of your pages to be crawled (i.e., the whole website), you can add the following to your robots.txt file:
User-agent: *
Disallow: /
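If instead you only want to keep a few specific pages out while the rest of the site stays crawlable, you can disallow just those paths (the paths below are placeholders; swap in your own):
User-agent: *
Disallow: /work-in-progress/
Disallow: /new-page.html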