How long does it take for traffic to bounce back from an accidental robots.txt disallow of root?
-
We accidentally uploaded a robots.txt that disallowed root for all user agents last Tuesday and did not catch the error until yesterday, so 6 days of exposure in total. Organic traffic is down 20%. Google has since picked up the correct version of the robots.txt file. However, we're still seeing awful titles/descriptions in the SERPs, and traffic is not coming back. GWT shows that not many pages were actually removed from the index, but we're still seeing drastic ranking decreases.
Anyone been through this? Any sort of timeline for a recovery?
Much appreciated!
-
I'd give it a month before you see that bounce back. I wouldn't expect the same rankings as before, as Google will be re-evaluating your pages, but you should see traffic pick up.
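For reference, the difference between the broken file and the fixed one is a single character. A minimal sketch of what the two versions presumably looked like:

# Accidental version: blocks all compliant crawlers from the entire site
User-agent: *
Disallow: /

# Corrected version: an empty Disallow value allows everything
User-agent: *
Disallow:

Once the corrected file is live, recovery mostly depends on how quickly Googlebot recrawls the affected pages.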
Related Questions
-
Crawl solutions for landing pages that don't contain a robots.txt file?
My site (www.nomader.com) is currently built on Instapage, which does not offer the ability to add a robots.txt file. I plan to migrate to a Shopify site in the coming months, but for now the Instapage site is my primary website. In the interim, would you suggest that I manually request a Google crawl through the search console tool? If so, how often? Any other suggestions for countering this Meta Noindex issue?
Technical SEO | | Nomader1 -
Will a Robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi, I have a client that had thousands of dynamic php pages indexed by Google that shouldn't have been. He has since blocked these php pages via a robots.txt disallow. Unfortunately, many of those php pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the php disallow. If we create 301 redirects for some of these php URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the php pages? Thanks very much, V
Technical SEO | | Voodak0 -
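On the question above: robots.txt is a crawl directive, not an indexing directive, so Googlebot never fetches a blocked URL and therefore never sees any 301 or canonical tag on it. The disallow has to be lifted for those php URLs before redirects can pass value. A hedged sketch, with all paths and parameters hypothetical:

# robots.txt: remove (or narrow) the block so Googlebot can recrawl
# the linked php URLs and discover their redirects
User-agent: *
# Disallow: /catalog/   (the rule that would need to go)

# .htaccess: 301 one high-value dynamic URL to its static equivalent;
# the trailing "?" strips the query string from the target
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=42$
RewriteRule ^catalog/item\.php$ http://www.example.com/blue-widget/? [R=301,L]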
Sudden traffic drop after mobile site
Hi to all. I have a huge problem with my website traffic. We published our mobile site under the mobile domain m.etilercerrental.com, published the mobile sitemap under the main site as sitemap-mobile.xml, and notified Google from Webmaster Tools. After a while the whole site's traffic went down badly; almost 60%, which doesn't make sense. The mobile sitemap is hosted under the desktop site, and we didn't add any link to switch between the mobile and desktop versions; we automatically redirect between the desktop and mobile versions by detecting the user agent. One other thing: we have an announcement section in the desktop version but not in mobile, yet we do not redirect to the mobile site when a user visits the announcement section, even on a mobile phone, because we do not display it in the mobile version.
One last thing: we make the redirections automatically and do not ask for the user's preference, for example by showing a link at the bottom of the page to switch between the mobile and desktop versions. While developing the mobile site we followed the instructions at https://developers.google.com/webmasters/smartphone-sites/?hl=tr To help you look into the details, here are some site URLs. Site URL: http://www.etilercarrental.com/
Sitemap XML: http://www.etilercarrental.com/sitemap.xml
Mobile Sitemap XML: http://www.etilercarrental.com/sitemap-mobile.xml
Technical SEO | | gkhnrtk0 -
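A common culprit for the drop described above is that the two versions are not annotated as one site. Google's separate-mobile-URL guidelines of that era call for bidirectional annotations, sketched here with hypothetical paths and example.com standing in for the real domains:

<!-- On each desktop page: point to the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1" />

<!-- On each mobile page: canonical back to the desktop page -->
<link rel="canonical" href="http://www.example.com/page-1" />

Because the redirects are user-agent based, serving a Vary: User-Agent HTTP header on those URLs also tells crawlers and caches that the response differs by device.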
Robots.txt Download vs Cache
We made an update to the robots.txt file this morning after the initial download of the robots.txt file. I then submitted the page through Fetch as Googlebot to get the changes in ASAP. The cache timestamp on the page now shows Sep 27, 2013 15:35:28 GMT, which would put it at about 6 hours ago. However, the Blocked URLs tab in Google WMT shows the robots.txt last downloaded 14 hours ago, and it is therefore showing the old file. This leads me to believe that, for robots.txt, the cache date and the download time are independent. Is there any way to get Google to recognize the new file other than waiting this out?
Technical SEO | | Rich_A0 -
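Google is generally understood to cache robots.txt for up to about a day, independently of the page-cache timestamp shown for a URL, which matches the 6-hour versus 14-hour gap above. There is no reliable way to force an immediate refetch, but serving the file with a short max-age may encourage earlier downloads. A hedged .htaccess sketch (assumes mod_headers is available; Google is not obliged to honor the hint):

# Hint that robots.txt should be refetched frequently
<Files "robots.txt">
  Header set Cache-Control "max-age=3600"
</Files>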
Empty Meta Robots Directive - Harmful?
Hi, We had a coding update, and a side effect was that our meta robots directive was emptied; in other words, it now renders as an empty directive on all of the site. I've since noticed that Google's cache date on all of the pages, at least the ones I tested, is no later than 17 December '12, the Monday after the directive was emptied en masse. So, A: does anyone have solid evidence of an empty directive causing problems? Past experience, a Matt Cutts or Fishkin quote, etc. And then B: it seems fairly well correlated, but does my entire site's homogeneous cache date point to this tag removal? Or is it fairly normal to have a uniform cache date across a large site (we're a large ecommerce site)? Our site: http://www.zando.co.za/ I'm having the directive reinstated as soon as dev time permits. And then, for extra credit, is there a way with Google's API, or perhaps some other tool, to run an arbitrary list of URLs and retrieve cached dates? I'd want to do this for diagnostic purposes and preferably in a way that's OK with Google; I'd avoid cURLing for the cached URL and scraping out the dates with Bash, or any such thing. Cheers,
Technical SEO | | RocketZando0 -
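On the empty-directive question: a missing or empty meta robots tag falls back to the defaults, so by itself it should be harmless; "index, follow" is what crawlers assume anyway. A sketch of the presumed before and after (the exact markup on the site is an assumption):

<!-- What the coding update presumably left behind -->
<meta name="robots" content="">

<!-- Equivalent to the default crawlers assume when no directive exists -->
<meta name="robots" content="index, follow">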
How long should you keep 301 redirects?
Hi, Back in 2009 I decided to update an older site from .htm and .shtml to .php. To minimize the impact, I would go in every month and 301 redirect the .shtml pages to the new .php pages, so I have many redirects ranging from 2009 through 2010. I had left the old 301s in place because I felt they would only be used if needed, but I should probably clean up my .htaccess by removing the old 301 redirects if they are no longer needed. How long should you keep this type of 301 redirect? Thanks!
Technical SEO | | Force70 -
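The usual guidance for the question above is to keep 301s for as long as the old URLs still receive traffic or hold backlinks. Since these redirects map one-to-one on filename, the year's worth of individual rules can also be collapsed into a single pattern instead of deleted. A hedged mod_rewrite sketch, assuming only the extension changed:

RewriteEngine On
# /any/page.htm or /any/page.shtml -> /any/page.php (same name, new extension)
RewriteRule ^(.+)\.(htm|shtml)$ /$1.php [R=301,L]

One pattern rule keeps the .htaccess tidy while preserving every legacy redirect.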
Best Redirect for old .htm extension to root?
I have been trying to figure this out with different redirects but cannot seem to get this correct. Some of our forums link to pages that do not exist or are very old. They have a (.htm) extension. I do not want to redirect .htm to .php, because the actual names of the links have changed too. What is the best code to redirect any link that has a .htm extension to the root domain? Right now I have this code to redirect index.htm to the root, but that is all it works for, I think: RewriteCond %{THE_REQUEST} ^.*/index.htm RewriteRule ^(.*)index.htm$ http://www.example.com$1 [R=301,L]
Technical SEO | | hfranz0 -
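For the question above, a single rule can catch every request ending in .htm and send it to the homepage. A hedged sketch, with www.example.com standing in for the real domain:

RewriteEngine On
# Any URL path ending in .htm (case-insensitive): 301 to the site root
RewriteRule \.htm$ http://www.example.com/ [R=301,L,NC]

Note that a blanket redirect to the root discards page-level signals; where an old .htm URL has an obvious new equivalent, a one-to-one 301 passes more value.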
How long does it take open site explorer to recognize new links?
I'm building a steady link profile for one of my websites, but the new links still haven't shown up in Open Site Explorer even after two months. How long does it take OSE to recognize new backlinks?
Technical SEO | | C-Style2