Robots.txt was set to disallow for 14 days
-
We updated our website and accidentally overwrote our robots.txt file with a version that blocked all crawling ("Disallow: /"). We realized the issue 14 days later, after our organic visits began to drop significantly, and we quickly replaced the robots.txt file with the correct version so crawling could resume. Given the impact on our organic visits, we have a few questions, and any help would be greatly appreciated -
Will the site get back to its original status/ranking?
If so, how long would that take?
Is there anything we can do to speed up the process?
Thanks
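For reference, the broken and corrected files can differ by a single character: "Disallow: /" blocks the whole site, while an empty Disallow permits everything. A minimal sketch of each (the sitemap line is a placeholder addition, not from the original files):

```
# Broken version — blocks all crawlers from the entire site
User-agent: *
Disallow: /

# Corrected version — an empty Disallow allows all crawling
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```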
-
Thank you for the response.
We have been watching over the past week; there has been only a very small change in the number of indexed URLs in GSC and no change in the stats on the Moz dashboard.
Is that normal? How often does Moz update the stats?
-
This is commonly done intentionally when launching a site on a new domain. Once the disallow is removed, the general practice is to request reindexing of the root domain page (and possibly some key pages whose paths are not likely to be found through navigation) in GSC, and also to submit (or re-submit) your sitemaps directly in GSC (even though they may/should also be referenced in your robots.txt file).
I'm not sure how long you can expect the search engines to take, since your situation is a bit unusual: the site was indexed, then disallowed temporarily. Judging from launches of brand-new domains, getting re-indexed should be quick (perhaps a few days), but regaining previous ranking positions may be slower (I'm unsure of the timing on this).
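Before relying on GSC to confirm the fix, the replacement file can also be sanity-checked locally with Python's standard-library robots.txt parser. The URL and file contents below are illustrative placeholders mirroring the situation in the question:

```python
from urllib.robotparser import RobotFileParser

def is_crawlable(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Parse a robots.txt body and report whether `agent` may fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# The accidental version blocked everything...
broken = "User-agent: *\nDisallow: /\n"
# ...while an empty Disallow allows all crawling.
fixed = "User-agent: *\nDisallow:\n"

print(is_crawlable(broken, "https://www.example.com/"))  # False
print(is_crawlable(fixed, "https://www.example.com/"))   # True
```

Running this against the actual replacement file before deploying it would have caught the bad version immediately.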
Related Questions
-
What happens to crawled URLs subsequently blocked by robots.txt?
We have a very large store with 278,146 individual product pages. Since these are all various sizes and packaging quantities of fewer than 200 product categories, my feeling is that Google would be better off making sure our category pages are indexed. I would like to block all product pages via robots.txt until we are sure all category pages are indexed, then unblock them. Our product pages rarely change and have no ratings or product reviews, so there is little reason for a search engine to revisit a product page. The sales team is afraid that blocking a previously indexed product page will result in it being removed from the Google index, and would prefer to submit the categories by hand, 10 per day, via requested crawling. Which is the better practice?
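A pattern-based block of the kind described above is a short robots.txt fragment. The path prefixes below are assumptions about the store's URL structure, and note that robots.txt prevents crawling, not indexing, so previously indexed product pages may linger in the index for some time after being blocked:

```
User-agent: *
# Assumed URL structure — adjust prefixes to match the actual store
Disallow: /products/
Allow: /categories/
```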
Recovering an old disallow file?
Hi guys, We had an SEO agency do a disallow request on one of our sites a while back. They have no trace of the robots.txt file or of the links they disallowed. Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find which links were disallowed? Cheers.
Have I set up my structured data correctly? The testing tool suggests not
Hi, I've recently marked up some Events for a client in the hope that they'll appear as rich snippets in their SERPs. I have access to their Google Search Console, so I used the Data Highlighter facility to mark them up, rather than the Raven plugin available for WordPress sites like this. I completed this on 10th July and the snippets are yet to appear. I understand that this can take time and there are no guarantees, but as a novice it would be reassuring if someone could advise whether I have done this correctly. We did, incidentally, resubmit a sitemap after completing this task, but I'm not sure if that makes any difference. I've read that it's the Structured Data Testing Tool that I need to use to test my markup, but when I input the URLs below, the tool doesn't tell me a lot, which either suggests I've marked it up incorrectly, or I don't know how to read it! http://www.ad-esse.com/events/19th-august-2015-reducing-costs-changing-culture-improving-services/
http://www.ad-esse.com/events/160915-reducing-costs-changing-culture-improving-services-london/
http://www.ad-esse.com/events/151015-reducing-costs-changing-culture-improving-services-london/ Any guidance welcomed! Many thanks,
Nathan
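For comparison, the same kind of Event markup can also be added directly to the page as JSON-LD, which the Structured Data Testing Tool parses explicitly. Every value below is a hypothetical placeholder loosely based on the event names visible in the URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Reducing Costs, Changing Culture, Improving Services",
  "startDate": "2015-08-19",
  "location": {
    "@type": "Place",
    "name": "Venue name (placeholder)",
    "address": "London, UK"
  }
}
</script>
```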
Setting up 301 Redirects after acquisition?
Hello! The company that I work for has recently acquired two other companies. I was wondering what the best strategy would be as it relates to redirects / authority. Please help! Thanks
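A common approach after an acquisition is to 301-redirect the acquired domains to the most relevant pages on the parent site, page-to-page where possible rather than everything to the homepage, so link equity maps to matching content. A minimal Apache .htaccess sketch, with placeholder domains:

```
# .htaccess on the acquired domain — both domains are placeholders
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?acquired-company\.com$ [NC]
RewriteRule ^(.*)$ https://www.parent-company.com/$1 [R=301,L]
```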
Can't find X-Robots tag!
Hi all. I've been checking out http://www.unthankbooks.com/ as it seems to have some indexing problems. I ran a server header check and got a 200 response. However, it also shows the following: X-Robots-Tag: noindex, nofollow. It's not in the page HTML, though. Could it be being picked up from somewhere else?
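Since X-Robots-Tag is an HTTP response header rather than page markup, it is typically injected by server configuration or a CDN, not by the HTML itself. A small sketch of inspecting a captured header set for it; the helper function and example headers here are hypothetical:

```python
def robots_directives(headers: dict[str, str]) -> set[str]:
    """Extract directives from an X-Robots-Tag response header (case-insensitive)."""
    for name, value in headers.items():
        if name.lower() == "x-robots-tag":
            return {d.strip().lower() for d in value.split(",")}
    return set()

# Example response headers matching the situation in the question
headers = {"Content-Type": "text/html", "X-Robots-Tag": "noindex, nofollow"}
print(sorted(robots_directives(headers)))  # ['nofollow', 'noindex']
```

Feeding real response headers (e.g. from a header-checker tool or browser dev tools) through a check like this confirms whether the block is server-side.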
Does using robots.txt to block pages decrease search traffic?
I know you can use robots.txt to tell search engines not to spend their resources crawling certain pages. So, if you have a section of your website that is good content but is never updated, and you want the search engines to index new content faster, would it work to block the good, unchanged content with robots.txt? Would this content lose any search traffic if it were blocked by robots.txt? Does anyone have any available case studies?
What would cause a drastic drop in pages crawled per day?
The site didn't go down. There was no drop in rankings or traffic. But we went from averaging 150,000 pages crawled per day to ~1,000 pages crawled per day. We're now back up to ~100,000 crawled per day, but we went more than a week with only 1,000 pages being crawled daily. The question is: what could cause this drastic (but temporary) reduction in pages crawled?
Are there certain times of the day that it is better to update content or blogs? How do I find out what time is best for a particular site?
Trying to figure out how best to optimize the timing of new content, including blogs and other on-page content.