Longevity of robots.txt effects on Google rankings
-
This may be a difficult question to answer without a lot more information, but I'm curious whether there's any general thinking that could shed some light on the following scenario. I recently heard about it and want to be able to offer some sound advice:
An extremely reputable non-profit site with excellent rankings went through a redesign and changeover to WordPress. A robots.txt file blocking crawlers was used on the dev server during development.
Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed, and the new site (same content, same nav) had gone live with it in place. The file was removed and a site index was forced. How long might it take for the site to reappear and regain its past standing in the SERPs if rankings have been damaged? What would the expected recovery time be?
-
They were paying attention to GA but lapsed, and when they checked back in they saw the drop in traffic. Great point about that "critical" message. The developers did force a crawl, and I'm hoping you are correct about the time it might take.
-
Thank you, methodicalweb. Great suggestions.
-
Thanks, Travis. You've offered a lot of very interesting points.
I will double-check that they have looked at the server log files, but I'm pretty confident that they have done that.
They did assure me that the proper redirects were done, but I'm not sure what they did regarding extensions. There was also a server change...
-
Thanks for clarifying, KeriMorgret. Much appreciated, as are all your thoughts. I will definitely suggest that the monitoring software be used to avoid any future problems. This was such an unnecessary and frustrating experience.
-
If they had been paying attention to WMT, they would have seen a "critical" message right away saying the site was blocked. Forcing a crawl (crawl all URLs) should result in the site getting re-indexed extremely quickly. Rankings should return to where they were before.
-
The only thing I would add to the existing responses is that if, after a "site:www.mysite.com" query, you notice that some key landing pages haven't been indexed, submit them via Webmaster Tools (Fetch as Google).
I would also make sure your sitemap is up to date and submitted via WMT too; WMT will also tell you how many of the sitemap URLs have been indexed.
These two things could speed up your re-indexing. My guess is that if it's a reputable site and the migration of URLs was done properly, you'll probably get re-indexed quickly anyway. A quick sketch of that sitemap check is below.
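If it helps, here's a minimal Python sketch of the idea: pull the sitemap, then confirm each URL resolves with a 200 and is allowed by the live robots.txt. It assumes a plain urlset sitemap (not a sitemap index), and www.mysite.com is just a placeholder domain.

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

import requests

SITE = "https://www.mysite.com"  # placeholder - swap in the real site
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Load the live robots.txt so we can check crawlability the way Googlebot would.
robots = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
robots.read()

# Pull every <loc> URL out of the sitemap (assumes a plain urlset, not a sitemap index).
sitemap_xml = requests.get(f"{SITE}/sitemap.xml", timeout=10).text
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", NS)]

for url in urls:
    allowed = robots.can_fetch("Googlebot", url)
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    print(f"{url}: status={status}, crawlable={allowed}")
```

Any URL that comes back non-200 or not crawlable is a candidate for Fetch as Google or a redirect fix.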
George
-
Hi Gina,
Yes, that is what I mean. The dev team (or you, if you chose) would get an email saying the robots.txt file had changed. I was in-house at a non-profit where we had an overseas dev team that wasn't too savvy about SEO, so I was the one who would get the emails, then go and send them an email asking them to fix it.
I don't believe there's a hard and fast answer here, as it in part depends on how quickly your site is crawled.
-
If possible, take a look at the server log files. That should give you a better idea of when/how often Google crawled the site in recent history. The user agent you're looking for is googlebot.
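If the logs are in the standard combined format (the Apache/nginx default), a rough Python sketch like this can tally Googlebot requests per day; the file name is just a placeholder, and a proper check would also verify the bot via reverse DNS rather than trusting the user agent string.

```python
import re
from collections import Counter

LOG_FILE = "access.log"  # placeholder - point this at the real access log

# Matches a combined-format line: ... [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" 200 ... "user agent"
line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "\S+ \S+[^"]*" \d{3} .*"([^"]*)"$')

hits_per_day = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        match = line_re.search(line)
        if not match:
            continue
        day, user_agent = match.groups()
        if "googlebot" in user_agent.lower():  # claimed Googlebot; verify via reverse DNS for certainty
            hits_per_day[day] += 1

# Sorted as strings, which is good enough for a quick eyeball of crawl frequency.
for day, count in sorted(hits_per_day.items()):
    print(f"{day}: {count} Googlebot requests")
```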
Aside from the robots.txt faux pas, it's also possible that the proper redirects weren't put in place. That would also account for a dip in traffic. WordPress URLs are generally extensionless, which means any previous URL that contained an extension (.php, .html, .aspx) won't resolve properly, so without redirects the site would lose a chunk of referral traffic and link equity. Further, if the URL names have been changed from something like /our-non-profit.html to /about-our-non-profit, those would require a redirect as well.
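To sanity-check the redirect side, a rough sketch along these lines would confirm that each old extension-style URL returns a permanent redirect to its new home; "old_urls.txt" is a hypothetical list you'd build from the old sitemap, analytics, or the top landing pages.

```python
import requests

# Hypothetical input: one old URL per line (e.g. the /our-non-profit.html style URLs).
with open("old_urls.txt", encoding="utf-8") as fh:
    old_urls = [line.strip() for line in fh if line.strip()]

for url in old_urls:
    # Don't follow the redirect - we want to see exactly what the old URL returns.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"OK   {url} -> {resp.headers.get('Location')}")
    elif resp.status_code in (302, 307):
        print(f"WARN {url} redirects, but not with a permanent (301) status")
    else:
        print(f"MISS {url} returned {resp.status_code} - no redirect in place")
```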
I've seen brand new domains index in a matter of days, then rank very well in as little as one month. But that's the exception, not the rule.
Provided proper redirects are in place and nothing too drastic happened to on-page considerations, I would guesstimate two weeks to a month. If you start heading into the month time frame, it's time to look a little deeper.
edit: If the server changed, that would also add another wrinkle to the problem. In the past, one of my lovely hosts decided to force a change on me. It took about a month to recover.
-
Thanks so much for your response, KeriMorgret. I'm not sure I fully understand your suggestion unless you are saying that it would have alerted the dev team to the problem? If that is what you intended, I will pass this on to them - thank you.
The developer removed the robots.txt file, which fixed the problem, and I am trying to ascertain whether there is a general expectation for how something like this - a de-indexing - gets reversed within the Google algorithm.
-
I don't know how long it will take for reindexing, but I do have a suggestion (I have been in a very similar situation at a non-profit in the past).
Use monitoring software like https://polepositionweb.com/roi/codemonitor/index.php that will check the robots.txt file on your live server and any dev servers daily and email you if there is a change. Also, suggest that the live server's robots.txt file be made read-only, so it's harder to overwrite when updating the site.
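If a paid tool isn't an option, the same idea can be rough-sketched in a few lines of Python run daily from cron; the URL and snapshot path below are placeholders, and in practice you'd wire the alert up to email rather than a print.

```python
from pathlib import Path

import requests

ROBOTS_URL = "https://www.mysite.com/robots.txt"  # placeholder - live or dev server
SNAPSHOT = Path("robots_snapshot.txt")            # last known-good copy

current = requests.get(ROBOTS_URL, timeout=10).text
previous = SNAPSHOT.read_text(encoding="utf-8") if SNAPSHOT.exists() else None

if previous is not None and current != previous:
    # In practice, send an email or Slack alert here instead of printing.
    print(f"ALERT: {ROBOTS_URL} changed!\n--- was ---\n{previous}\n--- now ---\n{current}")

SNAPSHOT.write_text(current, encoding="utf-8")
```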