How long does it take for a robots.txt file to take effect?
-
My website has a demo site that is also indexed in Google, but I no longer need it, so yesterday I created a robots.txt file and uploaded it to the server. The demo folder contains some HTML files, and I want all of them removed from Google. The file contains:

User-agent: *
Disallow: /demo/

But Webmaster Tools still shows the pages. How long will this take to remove them from Google?
And is there an alternative way of doing it?
-
Google Webmaster Tools also has a Remove URLs function that can remove an entire directory, which may be of help to you.
-
And, if they are already indexed, you have to wait for them to be recrawled and then fall out of the index, so it's not immediate. Sometimes it takes days, sometimes weeks.
-
Hello,
The robots.txt directive will only prevent Google from crawling the pages. To remove pages that are already indexed, you need to add a "noindex" meta tag to the pages you want removed:
<meta name="robots" content="noindex">
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
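As an illustrative sketch (not the asker's actual files), a page inside /demo/ carrying the tag would look like this. One caveat worth noting: Googlebot has to be able to crawl a page to see the tag, so the Disallow: /demo/ rule would need to be lifted until the pages have dropped out of the index.

```html
<!-- Hypothetical page inside /demo/; path and content are illustrative. -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <!-- Tells compliant crawlers to drop this page from their index -->
    <meta name="robots" content="noindex">
    <title>Demo page</title>
  </head>
  <body>
    <p>Demo content scheduled for removal from the index.</p>
  </body>
</html>
```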
Related Questions
-
Block subdomain directory in robots.txt
Instead of blocking an entire sub-domain (fr.sitegeek.com) with robots.txt, we would like to block one directory (fr.sitegeek.com/blog).
'fr.sitegeek.com/blog' and 'www.sitegeek.com/blog' contain the same articles in a single language; only the labels are changed for the 'fr' version, and we assume this duplicate content causes a problem for SEO. We would like the 'www.sitegeek.com/blog' articles to be crawled and indexed, but not 'fr.sitegeek.com/blog'. So how can we block a single sub-domain directory (fr.sitegeek.com/blog) with robots.txt? This applies only to the blog directory of the 'fr' version; all other directories and pages of the 'fr' version should still be crawled and indexed. Thanks,
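Because robots.txt is read per host, one way this could look (a sketch; it assumes the fr.sitegeek.com subdomain can serve its own robots.txt at its root):

```text
# Served at fr.sitegeek.com/robots.txt (affects only the fr host)
User-agent: *
Disallow: /blog/
```

The file at www.sitegeek.com/robots.txt is a separate file, so /blog would remain crawlable on the www host.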
Rajiv
-
Bing Disavow file
Hi, I have just set up Bing Webmaster Tools and wanted to submit my disavow file. However, I can only work out how to add one link at a time; does anyone know how to upload a CSV file? Thanks in advance. Andy
-
Effects of pages heavily reliant on CSS for text and image content
We have a new feature that has been live for a couple of days here: http://www.imaging-resource.com/cameras/canon/t5/vs/canon/60d/ My concern is that the developer relied very heavily on CSS for content and image layout, such that the meat of our pages looks pretty meager: https://gist.github.com/anonymous/b1ccb77914c6722d40bd Google does parse CSS, but I'm not sure whether it does so for content or just to verify the site isn't doing something nefarious. Will Google see our deeper content in the CSS, or view the page as being very thin?
-
What should I block with a robots.txt file?
Hi Mozzers, We're having a hard time getting our site indexed, and I have a feeling my dev team may be blocking too much of our site via our robots.txt file. They say they have disallowed PHP and Smarty files. Is there any harm in allowing these pages? Thanks!
-
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you consider this a workable/acceptable solution?
-
What is better for SEO - a local video file or a YouTube video?
Should I use a video player and upload the videos to my website, or should I put my videos on YouTube and use the YouTube player?
-
Block all but one URL in a directory using robots.txt?
Is it possible to block all but one URL with robots.txt? For example, with domain.com/subfolder/example.html: if we block the /subfolder/ directory, we want all URLs except the exact-match URL (domain.com/subfolder/example.html) to be blocked.
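One way to sanity-check such a rule set locally is with Python's standard-library robots.txt parser. This is a sketch: the domain and paths are the hypothetical ones from the question, and note that Python's parser applies rules in file order (so the Allow line is listed first), whereas Googlebot uses most-specific (longest) match, which happens to give the same outcome for these paths.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: allow one exact URL, block the rest of /subfolder/.
# Allow comes first because this parser honors the first matching rule.
rules = """\
User-agent: *
Allow: /subfolder/example.html
Disallow: /subfolder/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://domain.com/subfolder/example.html"))  # True
print(rp.can_fetch("*", "http://domain.com/subfolder/other.html"))    # False
print(rp.can_fetch("*", "http://domain.com/"))                        # True
```

Always double-check the live file with Google's own robots.txt tester as well, since parser implementations differ on precedence.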
-
XML Sitemap instruction in robots.txt = Worth doing?
Hi fellow SEOs, Just a quick one: I was reading a few guides on Bing Webmaster Tools and found that you can use the robots.txt file to point crawlers/bots to your XML sitemap (they don't look for it by default). I was just wondering if it would be worth creating a robots.txt file purely for the purpose of pointing bots to the XML sitemap? I've submitted it manually to Google and Bing Webmaster Tools, but I was thinking more of the other bots (i.e. Mozbot, the SEOmoz bot?). Any thoughts would be appreciated! 🙂 Regards, Ash
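For reference, the directive described above is a single line in robots.txt, and a file whose only job is to advertise the sitemap might look like this (example.com is a placeholder):

```text
# Hypothetical robots.txt that only advertises the sitemap
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line sits outside any user-agent group and applies to all crawlers that honor the sitemaps protocol, so well-behaved bots can discover the sitemap without a manual submission.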