Longevity of robots.txt effects on Google rankings
-
This may be a difficult question to answer without a lot more information, but I'm curious whether there's any general thinking that could shed light on the following scenario, which I recently heard about and would like to be able to offer some sound advice on:
An extremely reputable non-profit site with excellent rankings went through a redesign and changeover to WordPress. A robots.txt file that blocked crawlers was used during development on the dev server.
Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed and that the new site (same content, same navigation) had gone live with it in place. The file was removed and a site reindex was forced. If rankings have been damaged, how long might it take for the site to reappear and regain its past standing in the SERPs? What would the expected recovery time be?
-
They were paying attention to GA but lapsed, and when they checked back in they saw the drop in traffic. Great point about that "critical" message. The developers did force a crawl, and I'm hoping you are correct about the time it might take.
-
Thank you, methodicalweb. Great suggestions.
-
Thanks, Travis. You've offered a lot of very interesting points.
I will double-check that they have looked at the server log files, but I'm pretty confident that they have done that.
They did assure me that the proper redirects were done, but I'm not sure what they did regarding extensions. There was also a server change...
-
Thanks for clarifying, KeriMorgret. Much appreciated, as are all your thoughts. I will definitely suggest that the monitoring software be used to avoid any future problems. This was such an unnecessary and frustrating experience.
-
If they were paying attention to WMT, they would have seen a "critical" message right away saying that the site was blocked. Forcing a crawl (crawl all URLs) should result in the site getting re-indexed extremely quickly. Rankings should return to where they were before.
-
The only thing I would add to the existing responses is that if, following a "site:www.mysite.com" query, you notice that some key landing pages haven't been indexed, submit them via Webmaster Tools (Fetch as Google).
I would also make sure your sitemap is up to date and submitted via WMT too. WMT will also tell you how many of the sitemap URLs have been indexed.
These two things could speed up your re-indexing. My guess is that if it's a reputable site and the migration of URLs was done properly, you'll probably get re-indexed quickly anyway.
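As an illustration only (the sitemap URL is a placeholder and the noindex test is a crude substring check, not anything specific to this site), a small script along these lines can spot-check the URLs in a sitemap before you resubmit it - each URL should come back 200 with no noindex directive:

```python
# Illustrative sketch: spot-check the URLs in an XML sitemap before resubmitting it.
# The sitemap URL is a placeholder, and the noindex test is a crude substring check.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.org/sitemap.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return the <loc> values from a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]

def check(url):
    """Fetch a URL and report errors or noindex directives."""
    req = urllib.request.Request(url, headers={"User-Agent": "sitemap-spot-check"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            robots_header = (resp.headers.get("X-Robots-Tag") or "").lower()
            body = resp.read(200_000).decode("utf-8", errors="ignore").lower()
    except urllib.error.HTTPError as err:
        return f"HTTP {err.code}"
    except Exception as exc:
        return f"error: {exc}"
    if "noindex" in robots_header or 'content="noindex' in body:
        return "noindex!"
    return "OK"

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        print(url, check(url))
```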
George
-
Hi Gina,
Yes, that is what I mean. The dev team (or you, if you chose) would get an email saying that the robots.txt file had changed. I was in-house at a non-profit where we had an overseas dev team that wasn't too savvy about SEO, so I was the one who would get the emails, then go and send the team an email asking them to fix it.
I don't believe there's a hard and fast answer here, as it in part depends on how quickly your site is crawled.
-
If possible, take a look at the server log files. That should give you a better idea of when/how often Google crawled the site in recent history. The user agent you're looking for is googlebot.
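For a quick read on crawl activity, a rough sketch like the one below tallies Googlebot requests per day from a combined-format access log. The log path is a placeholder, and a thorough audit would also reverse-DNS the client IPs, since the user agent can be spoofed:

```python
# Rough sketch: count Googlebot requests per day in a combined-format access log.
# The log path is a placeholder, and a real audit should reverse-DNS the client IPs,
# since anything can claim to be "Googlebot" in its user agent.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"        # hypothetical path
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # matches e.g. [05/Mar/2013

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

# Log files are already chronological, so insertion order is date order.
for day, hits in hits_per_day.items():
    print(day, hits)
```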
Aside from the robots.txt faux pas, it's also possible that the proper redirects weren't put in place; that would also account for a dip in traffic. WordPress URLs are generally extensionless, which means any previous URL that contained an extension (.php, .html, .aspx) won't resolve properly, so the site would lose a chunk of referral traffic and link equity. Further, if URL names have been changed, from something like /our-non-profit.html to /about-our-non-profit, those would require a redirect as well.
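As a rough illustration (the URL pairs below are made-up examples, not the site's actual URLs), a spot-check like this can confirm that a few old extension URLs answer with a 301 pointing at their new extensionless homes:

```python
# Sketch: confirm a few old extension-based URLs answer with a 301 that points
# at the new extensionless URLs. The URL pairs below are made-up examples.
import http.client
from urllib.parse import urlparse

OLD_TO_NEW = {
    "https://www.example.org/our-non-profit.html": "https://www.example.org/about-our-non-profit",
    "https://www.example.org/donate.php": "https://www.example.org/donate",
}

def first_hop(url):
    """Issue one HEAD request and return (status, Location) without following redirects."""
    parts = urlparse(url)
    conn = http.client.HTTPSConnection(parts.netloc, timeout=10)  # assumes https
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location", "") or ""
    finally:
        conn.close()

for old, expected in OLD_TO_NEW.items():
    status, location = first_hop(old)
    ok = status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{old} -> {status} {location} {'OK' if ok else 'CHECK THIS ONE'}")
```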
I've seen brand new domains index in a matter of days, then rank very well in as little as one month. But that's the exception, not the rule.
Provided proper redirects are in place and nothing too drastic happened to on-page considerations, I would guesstimate two weeks to a month. If you start heading into the month time frame, it's time to look a little deeper.
edit: If the server changed, that would also add another wrinkle to the problem. In the past, one of my lovely hosts decided to force a change on me. It took about a month to recover.
-
Thanks so much for your response, KeriMorgret. I'm not sure I fully understand your suggestion, unless you're saying that it would have alerted the dev team to the problem? If that was your intention, I'll pass it on to them - thank you.
The developer removed the robots.txt file, which fixed the problem, and I'm trying to ascertain whether there's a general expectation of how something like this - a de-indexing - gets reversed within the Google algorithm.
-
I don't know how long it will take for reindexing, but I do have a suggestion (I've been in a very similar situation at a non-profit in the past).
Use monitoring software like https://polepositionweb.com/roi/codemonitor/index.php that will check the robots.txt file on your live server and any dev servers daily and email you if there is a change. Also, suggest that the live server's robots.txt file be made read-only, so it's harder to overwrite when updating the site.
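If a paid monitor isn't an option, even a small scheduled script covers the basic check. Here's a minimal sketch, assuming a placeholder site URL and state file and printing instead of emailing; a real version would run daily from cron and send a proper alert:

```python
# Minimal sketch of a scheduled robots.txt watcher: fetch the live file, compare
# it with the last saved copy, and flag any change or a blanket "Disallow: /".
# The URL and state-file path are placeholders; a real version would send an
# email or chat alert instead of printing, and run daily from cron.
import urllib.request
from pathlib import Path

ROBOTS_URL = "https://www.example.org/robots.txt"   # hypothetical
STATE_FILE = Path("/var/tmp/robots_last_seen.txt")  # hypothetical

with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
    current = resp.read().decode("utf-8", errors="ignore")

previous = STATE_FILE.read_text() if STATE_FILE.exists() else None

if previous is not None and current != previous:
    print("ALERT: robots.txt has changed since the last check.")

if any(line.strip().lower() == "disallow: /" for line in current.splitlines()):
    print("WARNING: robots.txt contains a blanket 'Disallow: /' rule.")

STATE_FILE.write_text(current)
```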