Effect duration of a robots.txt file
-
My website contains a demo site that is also indexed in Google, but I no longer need it indexed, so yesterday I created a robots.txt file and uploaded it to the server. The demo folder contains some HTML files, and I want all of them removed from Google, but they are still showing in Webmaster Tools. The robots.txt contains:
User-agent: *
Disallow: /demo/
How long will this take to remove them from Google?
And is there an alternative way of doing it?
-
Google Webmaster Tools also has a remove URL function where you can remove an entire directory, which may be of help to you.
-
And if they are already indexed, you have to wait for them to be recrawled and then fall out of the index, so it's not immediate. Sometimes it takes days, sometimes weeks.
-
Hello,
The robots.txt directive will only prevent Google from crawling the pages. In order to remove the pages from the index you need to add a "meta noindex" tag to the pages you want removed. Keep in mind that Google can only see the noindex tag on pages it is allowed to crawl, so don't leave them blocked in robots.txt while you wait for them to drop out of the index.
<meta name="robots" content="noindex">
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
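If it helps, here is a minimal sketch for checking that the tag is actually in place: it fetches each demo page and reports whether its robots meta tag contains noindex. The URLs are hypothetical placeholders for your own /demo/ pages, and it uses only the Python standard library.

```python
# Minimal check that pages carry <meta name="robots" content="noindex">.
# The demo URLs below are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaParser(HTMLParser):
    """Records whether a <meta name="robots"> tag containing "noindex" was seen."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            content = (attrs.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

for page in ["https://www.example.com/demo/page1.html",
             "https://www.example.com/demo/page2.html"]:
    print(page, "has noindex" if has_noindex(page) else "is MISSING noindex")
```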
Related Questions
-
No Control Over Subdomains - What Will the Effect Be?
Hello all, I work for a university, and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com and students can have their own webpage at students.domain.com, etc. I know the possibilities of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is best to help our overall domain authority? Should the focus be on removing content that duplicates the www subdomain or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.
Intermediate & Advanced SEO | Jeff_Bender0 -
Title Tags in Sitecore are the same as navigation. How do I add keyword phrases without affecting my website's navigation?
I am working on overhauling the on-page SEO for an ecommerce website on Sitecore. I've done all my research and I am ready to plug the title tags and descriptions in. So, if the page in the navigation is 'SHOP', that is what sits in the title tag box. How do I add my 70 characters of keywords? Thanks. JOE
Intermediate & Advanced SEO | iJoe0 -
SEO Best Practices regarding Robots.txt disallow
I cannot find hard-and-fast direction on the following issue: it looks like the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from the Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains urls which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked, right? Thank you!!
Intermediate & Advanced SEO | jamiegriz0
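For the robots.txt question above, one way to decide is to cross-check the sitemap against the live robots.txt and see which of the 18,000 blocked URLs are even listed there. Below is a rough sketch; the domain is a hypothetical placeholder, it assumes a single sitemap file rather than a sitemap index, and Python's robotparser does not understand Google's * and $ wildcard extensions, so treat the count as approximate.

```python
# Cross-check sitemap URLs against robots.txt rules (hypothetical domain).
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"  # placeholder domain

rp = RobotFileParser(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Assumes a single sitemap file, not a sitemap index.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urlopen(SITE + "/sitemap.xml"))
urls = [loc.text for loc in tree.findall(".//sm:url/sm:loc", ns)]

blocked = [u for u in urls if not rp.can_fetch("*", u)]
print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked by robots.txt")
```

If most of the blocked URLs turn out to be account or internal-search pages, the usual approach is to drop them from the sitemap rather than to unblock them, since a sitemap should only list URLs you actually want indexed.
-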
Robots.txt: does it need the preceding directory structure?
Do you need the entire preceding path in robots.txt for it to match? E.g. I know if I add Disallow: /fish to robots.txt it will block:
/fish
/fish.html
/fish/salmon.html
/fishheads
/fishheads/yummy.html
/fish.php?id=anything
But would it block the following (examples taken from the Robots.txt Specifications)?
en/fish
en/fish.html
en/fish/salmon.html
en/fishheads
en/fishheads/yummy.html
en/fish.php?id=anything
I'm hoping it actually won't match; that way writing this particular robots.txt will be much easier! Basically I'm wanting to block many URLs that have BTS- in them, such as:
http://www.example.com/BTS-something
http://www.example.com/BTS-somethingelse
http://www.example.com/BTS-thingybob
But I have other pages that I do not want blocked, in subfolders that also have BTS- in them, such as:
http://www.example.com/somesubfolder/BTS-thingy
http://www.example.com/anothersubfolder/BTS-otherthingy
Thanks for listening.
Intermediate & Advanced SEO | Milian
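On the matching question above: Disallow rules are compared against the URL path from its very beginning, so Disallow: /fish should not block /en/fish, and Disallow: /BTS- should block the top-level BTS- URLs while leaving the subfolder ones alone. A quick way to sanity-check this is Python's built-in urllib.robotparser, shown here as a sketch using the question's example.com URLs:

```python
# Sanity-check robots.txt prefix matching with the standard library.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /fish
Disallow: /BTS-
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

tests = [
    "http://www.example.com/fish.html",                 # blocked
    "http://www.example.com/en/fish.html",              # allowed: path starts with /en/
    "http://www.example.com/BTS-something",             # blocked
    "http://www.example.com/somesubfolder/BTS-thingy",  # allowed
]
for url in tests:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")
```

Note that urllib.robotparser does not support Google's * and $ wildcards, but for plain prefix rules like these it mirrors what the specification describes.
-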
Suggestions for a cost effective, SEO safe domain that was previously penalised
Hi, I will try to explain my situation as clearly as I can, and any positive advice would be greatly appreciated. Please let me know if you have any questions, but I'm sorry, the domains are private.

I started my business 4 years ago, launched a website (site A) and worked hard to promote it in the best way I knew how. It brought in a good income for around 3 years but then was hit with some sort of Google penalty/filter! Knowing then what I know now would probably have avoided the problem altogether, but that's another story... So I bought another domain (site B) and started again with a new design, completely re-branded the company, and again have been working hard to promote it. It has been performing well and there has been steady progress 😉

However, since then I have been steadily promoting the penalised site (site A) and keeping an eye on it to see if the penalty/filter may be lifted. It initially lost around 75% of its traffic, but recently it has been doing much better in the SERPs and is again on the increase. My problem is that with my company now completely re-branded I want to keep consistency, but site A looks old and dated compared to the new one (site B) and I don't want to confuse users. So I need a cost-efficient and "safe" solution to this in terms of SEO and budget.

301 redirect to site B: I thought of a 301 redirect, BUT I'm concerned about the penalty/filter being passed onto the new site (site B), and I have read this does happen?

Complete redesign/re-brand of site A: This would probably be the best option except I'm limited on funds. I would need "another" full ecommerce site and it's just way too much money at the moment.

Remove site A completely: Funds are tight and I'm still feeling the effects of the penalty, so I really can't afford to lose any traffic at all!

Use site A as a micro-site: I thought of a micro-site with just the main product landing pages being used. I would use the same design as site B, then re-write the text and link everything to the new site. BUT I'm concerned about getting another penalty (duplicate content), as all the anchor text links going to site B would be identical! E.g. to use the same design as site B I would need to use the same layout, including navbars, anchor text links in the footer, etc., and I'm worried this may trigger a duplicate content penalty?

I hope there are some suggestions for my situation, and thanks in advance for your help. Thanks, Chris.
Intermediate & Advanced SEO | doorguy880 -
Does adding index.php at the end of the URL affect its rankings?
I have just had my site updated and we have put index.php at the end of all the URLs. Not long after, the site's rankings dropped. Checking the backlinks, they all go to (for example) http://www.website.com and not http://www.website.com/index.php. So could this change have affected rankings even though the old URL redirects to the new one?
Intermediate & Advanced SEO | authoritysitebuilder0
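On the index.php question above: as long as the old URLs return a permanent 301 (not a temporary 302) to the index.php versions, the existing backlinks should continue to pass value, so that is worth confirming. Here is a small sketch for checking the redirect status with only the Python standard library; the URLs are the question's placeholders.

```python
# Check what status code and Location header the old URLs return.
from http.client import HTTPConnection
from urllib.parse import urlparse

def show_redirect(url):
    parts = urlparse(url)
    conn = HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")  # swap in "GET" if the server dislikes HEAD
    resp = conn.getresponse()
    print(url, "->", resp.status, resp.getheader("Location"))
    conn.close()

show_redirect("http://www.website.com/")      # expect 301 with a Location ending in /index.php
show_redirect("http://www.website.com/page")  # hypothetical inner page
```
-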
Does having a file type on the end of a URL affect rankings (for example, www.fourcolormagnets.com/business-cards.php vs. www.fourcolormagnets.com/business-cards)?
Intermediate & Advanced SEO | JHSpecialty0 -
Are there any benefits to having dashes in file names?
Through searching, I can find lots of discussion regarding "dash vs underscore", but am having trouble with an even simpler question: Is there any SEO difference between using http://www.broadway.com/shows/milliondollarquartet.php vs. http://www.broadway.com/shows/million-dollar-quartet.php
Intermediate & Advanced SEO | RyanWhitney150