Longevity of robots.txt effects on Google rankings
-
This may be a difficult question to answer without a lot more information, but I'm curious whether any general thinking could shed some light on the following scenario, which I recently heard about and would like to be able to offer some sound advice on:
An extremely reputable non-profit site with excellent rankings went through a redesign and migration to WordPress. A robots.txt file blocking crawlers was used during development on the dev server.
Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed, and the new site (same content, same nav) had gone live with it in place. The file was removed and a recrawl was requested. If rankings have been damaged, how long might it take for the site to reappear and regain its past standing in the SERPs? What would the expected recovery time be?
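(For reference, the kind of dev-stage robots.txt that causes this is only two lines; a hypothetical example of what likely went live, not the site's actual file:)

```
# Dev-server robots.txt: tells all well-behaved crawlers to stay out of the whole site
User-agent: *
Disallow: /
```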
-
They were paying attention to GA but lapsed, and when they checked back in, saw a drop in traffic. Great point about that "critical" message. The developers did force a crawl, and I'm hoping you are correct about the time it might take.
-
Thank you methodicalweb. Great suggestions.
-
Thanks, Travis. You've offered a lot of very interesting points.
I will double-check that they have looked at the server log files, but I'm pretty confident that they have done that.
They did assure me that the proper redirects were done but I'm not sure what they did regarding extensions. There was also a server change.....
-
Thanks for clarifying, KeriMorgret. Much appreciated, as are all your thoughts. I will definitely suggest the monitoring software to avoid any future problems. This was such an unnecessary and frustrating experience.
-
If they were paying attention to WMT (Webmaster Tools), they would have seen a "critical" message right away saying that the site was blocked. Forcing a crawl (Crawl all URLs) should get the site re-indexed extremely quickly, and rankings should return to where they were before.
-
The only thing I would add to the existing responses is that if, following a "site:www.mysite.com" query, you notice that some key landing pages haven't been indexed, submit them via Webmaster Tools (Fetch as Google).
I would also make sure your sitemap is up to date and submitted via WMT too; it will also tell you how many of the sitemap URLs have been indexed.
These two things could speed up your re-indexing. My guess is that if it's a reputable site and the migration of URLs was done properly, you'll probably get re-indexed quickly anyway.
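As a quick sanity check on the sitemap side, counting the `<loc>` entries in your sitemap gives you the total to compare against the "indexed" figure WMT reports. A minimal Python sketch (the sitemap XML here is a made-up example):

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical two-URL sitemap for illustration.
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.org/</loc></url>
  <url><loc>https://www.example.org/about-our-non-profit</loc></url>
</urlset>"""

urls = sitemap_urls(example)
print(len(urls), "URLs in sitemap")  # compare against WMT's indexed count
```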
George
-
Hi Gina,
Yes, that is what I mean. The dev team (or you, if you chose) would get an email saying that the robots.txt file had changed. I was in-house at a non-profit where we had an overseas dev team that wasn't too savvy about SEO, so I was the one who would get the emails, then send the team a note asking them to fix it.
I don't believe there's a hard and fast answer here, as it in part depends on how quickly your site is crawled.
-
If possible, take a look at the server log files. They should give you a better idea of when and how often Google crawled the site in recent history. The user agent you're looking for is Googlebot.
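If the logs are in the usual combined format, a few lines of Python can tally Googlebot visits per day. A rough sketch (the regex and sample lines are illustrative, and serious log analysis should also verify the requesting IPs really belong to Google, since the user-agent string can be spoofed):

```python
import re
from collections import Counter

# Combined-log-format line; we only need the date and the trailing user-agent field.
LOG_RE = re.compile(r'\[(?P<day>[^:\]]+)[^\]]*\].*"(?P<agent>[^"]*)"\s*$')

def googlebot_hits_by_day(lines):
    """Count log lines whose user-agent mentions Googlebot, grouped by day."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "googlebot" in m.group("agent").lower():
            hits[m.group("day")] += 1
    return hits

# Two made-up log lines: one Googlebot fetch of robots.txt, one ordinary visitor.
sample = [
    '66.249.66.1 - - [10/Jun/2013:06:25:24 +0000] "GET /robots.txt HTTP/1.1" 200 26 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Jun/2013:06:26:02 +0000] "GET / HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (Windows NT 6.1)"',
]
print(googlebot_hits_by_day(sample))
```

A sudden drop-off in daily Googlebot hits around the launch date would line up with the robots.txt block taking effect.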
Aside from the robots.txt faux pas, it's also possible that the proper redirects weren't put in place, which would also account for a dip in traffic. WordPress URLs are generally extensionless, which means any previous URL that contained an extension (.php, .html, .aspx) won't resolve properly, and the site would lose a chunk of referral traffic and link equity. Further, if URL names have been changed, say from /our-non-profit.html to /about-our-non-profit, those require redirects as well.
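For the extension issue, assuming an Apache host, the redirects could be sketched in .htaccess along these lines (hypothetical patterns; the exact rules depend on the old URL structure):

```apache
RewriteEngine On

# Renamed page: /our-non-profit.html -> /about-our-non-profit
RewriteRule ^our-non-profit\.html$ /about-our-non-profit [R=301,L]

# Generic case: /any-page.html or /any-page.php -> /any-page
RewriteRule ^(.+)\.(html|php)$ /$1 [R=301,L]
```

Using 301 (permanent) rather than 302 redirects is what passes the old URLs' link equity to the new ones.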
I've seen brand new domains index in a matter of days, then rank very well in as little as one month. But that's the exception, not the rule.
Provided proper redirects are in place and nothing too drastic happened to on-page considerations, I would guesstimate two weeks to a month. If you start heading into the month time frame, it's time to look a little deeper.
edit: If the server changed, that would also add another wrinkle to the problem. In the past, one of my lovely hosts decided to force a change on me. It took about a month to recover.
-
Thanks so much for your response, KeriMorgret. I'm not sure I fully understand your suggestion, unless you mean that it would have alerted the dev team to the problem? If so, I'll pass it on to them, and thank you.
The developer removed the robots.txt file, which fixed the problem, and I am trying to ascertain whether there is a general expectation for how something like this - a de-indexing - gets reversed within the Google algorithm.
-
I don't know how long it will take for reindexing, but I do have a suggestion (I've been in a very similar situation at a non-profit in the past).
Use monitoring software like https://polepositionweb.com/roi/codemonitor/index.php that will check the robots.txt file on your live and dev servers daily and email you if there is a change. Also, suggest that the live server's robots.txt file be made read-only, so it's harder to overwrite when updating the site.
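If a third-party monitor isn't an option, the same idea is a small daily cron job: fingerprint the live robots.txt and alert when it changes. A minimal Python sketch of the comparison step (fetching the file and sending the email are omitted):

```python
import hashlib

def fingerprint(robots_txt):
    """Stable fingerprint of a robots.txt body, stored between daily runs."""
    return hashlib.sha256(robots_txt.encode("utf-8")).hexdigest()

def changed(previous_fp, current_body):
    """True if today's robots.txt differs from the stored fingerprint."""
    return fingerprint(current_body) != previous_fp

# Baseline: the file as it should look on the live server.
baseline = fingerprint("User-agent: *\nAllow: /\n")

# A dev file slipping onto the live server would trip the check.
print(changed(baseline, "User-agent: *\nDisallow: /\n"))  # -> True
```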