Recovering from a Programmer's Error
-
Hey Everybody!
Last year one of my bigger sites hit a snafu. I was getting 300k+ hits a day from Google, and then a developer released an update with a robots.txt file that basically blocked Google from the entire site.
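For anyone unfamiliar, the difference between a robots.txt that blocks everything and one that allows everything can be a single character (this is a generic sketch, not the actual file from the incident):

```
# Blocks every crawler from the entire site -- the damaging version:
User-agent: *
Disallow: /

# Allows full crawling -- the only difference is the trailing slash:
User-agent: *
Disallow:
```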
We didn't notice the bug until a few days later, but by then it was already too late. My Google traffic dropped to 30k a day, and I've been having the hardest time coming back ever since.
As a matter of fact, hundreds of sites that were aggregating my content started outranking me for my own terms.
For over a year I've been working on building back what I lost, and everything seemed to be coming together. I was back at 100k+ hits a day.
Until today... My developers repeated the exact same error as last year. They blocked Google from crawling my site for over 5 days, and now I'm down to 10k search engine hits a day.
My question: has anyone encountered this problem before, and what did you do to come back?
-
My Friend,
I was having exactly the same problem, and in the end my solution was to set the "read-only" attribute on this file at the operating-system level.
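On Linux, a sketch of that approach looks like the following (the path is an assumption; adjust it to your document root, and note that the immutable flag only works on filesystems that support it):

```shell
# Make robots.txt read-only so a deploy script can't silently overwrite it.
# /var/www/html is a placeholder path -- use your actual site root.
chmod 444 /var/www/html/robots.txt

# On ext-family filesystems you can go further and mark the file immutable,
# so even a root-owned deploy must run `chattr -i` first to change it:
sudo chattr +i /var/www/html/robots.txt
```

This won't stop a determined developer, but it turns a silent overwrite into a loud deploy failure, which is exactly what you want here.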
Hope it helps,
Claudio
-
I don't have an answer for your traffic, but I've had similar experiences with developers. I ended up using Code Monitor from Pole Position at https://polepositionweb.com/roi/codemonitor/index.php. I had it monitor the contents of the robots.txt file for the live site and all dev sites. Once a day it would check the file for changes and email me if there were any, so there was at most a 24-hour lag before I was notified that the devs had done something again.
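If you'd rather roll your own, the core of such a monitor is just a daily hash comparison. Here's a minimal Python sketch (the URL is a placeholder, and you'd wire the alert into your own email or chat tooling; this isn't how Code Monitor itself works):

```python
# Minimal robots.txt change monitor: run once a day (e.g. from cron),
# persist the last hash somewhere, and alert when it changes.
import hashlib
import urllib.request

ROBOTS_URL = "https://example.com/robots.txt"  # placeholder: your live site

def fingerprint(content: bytes) -> str:
    """Return a stable SHA-256 fingerprint for a robots.txt body."""
    return hashlib.sha256(content).hexdigest()

def fetch_robots(url: str = ROBOTS_URL) -> bytes:
    """Download the current robots.txt."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

def check(previous_hash: str, content: bytes) -> tuple[bool, str]:
    """Compare new content against the last known hash.

    Returns (changed, new_hash) so the caller can persist new_hash
    and fire an alert (email, Slack, etc.) when changed is True.
    """
    new_hash = fingerprint(content)
    return new_hash != previous_hash, new_hash
```

A cron entry calling a small wrapper around `check(stored_hash, fetch_robots())` gets you the same 24-hour worst-case lag the poster describes, for free.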
Related Questions
-
What should I do after a failed request for validation (error with noindex, nofollow) in new Google Search Console?
Hi guys, we have the following situation: after an error message in the new Google Search Console for a large number of pages with a noindex, nofollow tag, validation was requested before the problem was fixed (an incredibly ill-advised decision, taken before asking the SEO team for advice). Google started the validation, crawled 9 URLs, and changed the status to "Failed". All other URLs are still in "Pending" status. The problem has been fixed for more than 10 days, but apparently Google isn't crawling the pages, and none of the URLs is back in the index. We tried pinging several pages and HTML sitemaps, but with no result. Do you think we should request re-validation or wait longer? Is there something more we could do to speed up the process?
Intermediate & Advanced SEO | ParisChildress
I have 6 URL errors in GSC showing a 500 error code. How do I fix them?
I am not sure how to fix some errors that are popping up in Google Search Console. The response codes showing are all 500 errors. I need some advice as to how to fix these. What are my options?
Intermediate & Advanced SEO | pmull
18,000 'Title Element is too Long' Errors
How detrimental is this in the overall SEO scheme of things? Having checked 3 of our main competitors, they too seem to have similar issues... I am trying to look at a solution but it is proving very difficult! Thanks Andy
Intermediate & Advanced SEO | TomKing
Is there any way to recover my site's rankings?
My site has been top 3 for 'speed dating' on Google.co.uk since about 2003, and it dropped below the top 50 for a lot of its main keywords shortly after 27 Oct 2012. I did a reconsideration request and was told there was 'no manual spam action'. My conclusion is that I was dropped by Google because of poor-quality links I've gained over 10+ years. I have a Domain Authority of 40, a regular blog http://bit.ly/oKyi88, a Klout score of 42, user reviews, and quality content. Since Oct 2012 I've made some technical improvements and managed to get a few questionable links removed. I've continued blogging regularly and got more active on Twitter. I've seen no improvement, and my traffic is 80% down on last year. It would be great to be able to produce content that others want to link to, but I've not had much success with that in over 10 years of trying, and I've not seen many others in my sector with small budgets having much success either. Is there anything I can do to regain favour with Google?
Intermediate & Advanced SEO | benners
How important is it to fix Server Errors?
I know it is important to fix server errors, but we are trying to figure out how important: after our last build we have over 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API, which will take time. We are trying to decide whether to fix all these errors right now or to focus on other issues and fix them when we have time; they are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | DoRM
Fixing Duplicate Content Errors
SEOmoz Pro is showing some duplicate content errors, and I wondered the best way to fix them other than rewriting the content. Should I just remove the pages it found, or should I set up permanent redirects to the home page in case there is any link value or there are visitors on these duplicate pages? Thanks.
Intermediate & Advanced SEO | benners
How do I get rid of all the 404 errors in Google Webmaster Tools after building a new website under the same domain?
I recently launched my new website under the same domain as the old one. I did all the important 301 redirects, but it seems like every URL that was in Google's index is still there, now with a 404 error code. How can I get rid of this problem? For example, if you google my company name 'romancing diamonds', half the links under the name are 404 errors. Look at my Webmaster Tools and you'll see the same thing. Is there any way to remove all those previous URLs from Google's index and start anew? Shawn
Intermediate & Advanced SEO | Romancing
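For anyone auditing their redirects after a relaunch like this, a typical 301 rule in an Apache .htaccess looks like the following (the paths and domain are made up for illustration, not taken from the question above):

```
# .htaccess sketch -- one-off redirect for a single moved page:
Redirect 301 /old-page.html https://www.example.com/new-page/

# Pattern-based redirect for a whole renamed section:
RedirectMatch 301 ^/old-category/(.*)$ https://www.example.com/new-category/$1
```

If old URLs still return 404 despite rules like these, the rules aren't matching the URLs Google actually has indexed, which is worth verifying with `curl -I` against a few of the reported 404s.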
What has this subdomain done to recover from Panda?
I found that doctor.webmd.com was affected by Google Panda, and then recovered (if you look at traffic on compete.com). What do you think they did to recover?
Intermediate & Advanced SEO | nicole.healthline