Recovering from a Programmer's Error
-
Hey Everybody!
Last year one of my bigger sites hit a snafu. I was getting 300k+ hits a day from Google, and then a developer released an update with a robots.txt file that blocked Google from the entire site.
We didn't notice the bug until a few days later, but by then, it was already too late. My google traffic dropped to 30k a day and I've been having the hardest time coming back ever since.
As a matter of fact, hundreds of sites that were aggregating my content started outranking me for my own terms.
For over a year, I've been working on rebuilding what I lost, and everything seemed to be coming together. I was back at 100k+ hits a day.
Until today... my developers repeated the exact same error as last year. They blocked Google from crawling my site for over five days, and now I'm down to 10k search engine hits a day.
My question: has anyone encountered this problem before, and what did you do to come back?
-
My Friend,
I was having exactly the same problem, and in the end my solution was to set the "Read-Only" attribute on this file at the operating system level.
Hope it helps,
Claudio
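On a Unix-like server, Claudio's read-only trick amounts to dropping the write bits on the file. A minimal sketch in Python; the file path and the "known-good" contents here are assumptions for illustration:

```python
import os
import stat

ROBOTS_PATH = "robots.txt"  # assumed path to the live robots.txt

# Write a known-good robots.txt ("Disallow:" with no path allows everything),
# then drop all write permissions so a careless deploy can't silently
# overwrite the file without an explicit chmod first.
with open(ROBOTS_PATH, "w") as f:
    f.write("User-agent: *\nDisallow:\n")

os.chmod(ROBOTS_PATH, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # mode 0444

mode = stat.S_IMODE(os.stat(ROBOTS_PATH).st_mode)
print(oct(mode))  # 0o444
```

Note this only stops ordinary overwrites; a deploy running as root, or one that deletes and recreates the file, still gets through, so monitoring the file is worth doing as well.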
-
I don't have an answer for your traffic, but I've had similar experiences with developers. I ended up using Code Monitor from Pole Position at https://polepositionweb.com/roi/codemonitor/index.php. I had it monitor the contents of the robots.txt file for the live site and all dev sites. Once a day it would check the file and email me if anything had changed, so the maximum lag before I knew the devs had done something again was 24 hours.
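A homegrown version of that kind of monitor takes only a few lines of standard-library Python. This sketch covers two checks a scheduled job (cron, etc.) could alert on: whether the file changed at all, and whether it now blocks Googlebot; the fetching and emailing parts are left out, and the example URL is a placeholder:

```python
import hashlib
from urllib.robotparser import RobotFileParser

def robots_fingerprint(content: str) -> str:
    """Hash the robots.txt body so any edit at all is detectable."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def blocks_googlebot(content: str, url: str = "https://example.com/") -> bool:
    """Return True if this robots.txt would stop Googlebot fetching `url`."""
    parser = RobotFileParser()
    parser.parse(content.splitlines())
    return not parser.can_fetch("Googlebot", url)

good = "User-agent: *\nDisallow:\n"   # empty Disallow allows everything
bad = "User-agent: *\nDisallow: /\n"  # the kind of update that tanks traffic

print(blocks_googlebot(good))  # False
print(blocks_googlebot(bad))   # True
print(robots_fingerprint(good) == robots_fingerprint(bad))  # False
```

A real monitor would fetch the live robots.txt on a schedule, store the fingerprint, and email when the fingerprint changes or `blocks_googlebot` flips to True.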
Related Questions
-
How to recover rankings after a hacking
Hello, I'm Alessia, and a few months ago (end of March) my site was hacked: the hackers created more than 30,000 links in Japanese to sell tires. I've successfully removed the hack, and after 14 days of struggle I even decided to move the site to Siteground, as they've been really keen to help. I still have some problems and I desperately need your tips.

In Search Console, Google is reporting the 30,000+ 404 errors for the content created by the hackers, which is no longer available. I've been advised to serve a 410 for those links, as they might have a penalty effect in the SERPs. I also have 50 503 server errors that Google found back in April but that are still there. What should I do to solve them?

I still get a lot of traffic from Japan, even though I've removed all the content and asked Google to disavow the spammy backlinks. Do you think I still have those keywords on the page? I don't understand how people can still find me. Those keywords show up in Analytics but get no real clicks, since the content isn't there anymore. I also asked Google to remove the links in Search Console with the URL removal tool, but not all of my requests were accepted.

My site has disappeared from the organic results, even though Google never flagged it as hacked (there were no manual actions in Search Console). What can I do to regain my organic positioning? I've just tried the "Fetch as Google" option in Search Console for the entire website. Thank you all, and I look forward to your replies. Thanks! Alessia
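On the 404-versus-410 point: 410 Gone is a response status, not a redirect, and it tells Google the hacked URLs were removed deliberately, which tends to get them dropped from the index faster than a plain 404. A minimal sketch of the decision logic in Python; the URL pattern for the spam pages is purely hypothetical, since the real one would come from the 404 list in Search Console:

```python
import re

# Hypothetical pattern for the hacker-generated tire-spam URLs; in practice
# you would derive this from the actual 404 report in Search Console.
HACKED_PATTERN = re.compile(r"^/[a-z0-9-]+-tires?-.*$")

def status_for(path: str, known_pages: set) -> int:
    """Return the HTTP status to serve for a requested path."""
    if path in known_pages:
        return 200
    if HACKED_PATTERN.match(path):
        # 410 Gone: removed on purpose, never coming back.
        return 410
    return 404

pages = {"/", "/about/"}
print(status_for("/about/", pages))             # 200
print(status_for("/cheap-tires-tokyo", pages))  # 410
print(status_for("/missing-page", pages))       # 404
```

In practice this logic would live in the web server or CMS (for example an Apache RewriteRule with the [G] flag), but the decision table is the same.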
Intermediate & Advanced SEO | AlessiaCamera
-
URL errors in Webmaster Tools for pages that don't exist?
Hello, for some time now we have had URLs showing up in Google Webmaster Tools as 404 errors, but the pages don't exist on our website and never have. Here's an example: cosmetic-dentistry/28yearold-southport-dentist-wins-best-young-dentist-award/801530293. The root is goo.gl/vi4N4F. Really confused about this. We have recently moved our website to WordPress. Thanks, Ade
Intermediate & Advanced SEO | popcreativeltd
-
404 Errors
Do 404 errors really have a lot of impact on rankings and the overall authority of a site with Google? Say you have a site where all the pages have moved apart from the home page, which is exactly the same as before the move, so most of your pages are showing 404 errors.
Intermediate & Advanced SEO | summer300
-
Error reports showing pages that don't exist on website
I have a website that is showing lots of errors (pages that cannot be found) in Google Webmaster Tools. I went through the errors and redirected the pages I could. A bunch of the remaining URLs are not really pages, which is why they show errors. What's strange is that some of the URLs are feeds, which were never created. I went into Google Webmaster Tools and looked at the Remove URL tool. I'm using it, but I'm confused about whether I should select the "remove page from search results and cache" option or the "remove directory" option; the directory one confuses me. I don't want to accidentally delete core pages of the site from the search engines. Can anybody shed some light on this or recommend which I should select? Thank you, Wendy
Intermediate & Advanced SEO | SOM24
-
Best way to fix 404 crawl errors caused by private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors (404 Not Found). Those 44 blog pages were set to Private mode in the WordPress theme, causing the 404 issue.
I reviewed the content of those 44 pages to see why those 2010 blog posts were set to private. I noticed that all 44 posts were pretty much copied from other external blog posts, so I'm thinking the previous agency set those pages to private to avoid getting hit for duplicate content issues. All blog posts from 2011 onward look like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages?
A. Remove the 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of the 44 posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)
I didn't find any external links pointing to any of the 44 blog pages, so I was considering removing them; however, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision. Thanks
Intermediate & Advanced SEO | SEOEND
-
URL errors in Google Webmaster Tools
Hi, within Google Webmaster Tools the "Crawl errors" report, under "Not found", shows the 404 errors it has found. Clicking any column heading will reorder them. One column is "Priority"; do you think Google is telling me it has ranked the errors by how urgently they need a fix? There is no reference to this in the Webmaster Tools help. Many thanks, Nigel
Intermediate & Advanced SEO | Richard555
-
Fixing Duplicate Content Errors
SEOmoz Pro is showing some duplicate content errors, and I wondered about the best way to fix them other than rewriting the content. Should I just remove the pages it found, or should I set up permanent redirects through to the home page in case there is any link value or there are visitors on these duplicate pages? Thanks.
Intermediate & Advanced SEO | benners
-
Have you ever seen this 404 error: 'www.mysite.com/Cached' in GWT?
Google Webmaster Tools just started showing some strange pages under "Not found" crawl errors:
www.mysite.com/Cached
www.mysite.com/item-na... <--- with the three dots, INSTEAD of www.mysite.com/item-name/
I have just 301'd them for now, but is this a sign of a technical issue? The site is PHP/SQL and I'm doing the URL rewrites/301s etc. in .htaccess. Thanks! -Dan
EDIT: Also, wanted to add, there is no "linked to" page.
Intermediate & Advanced SEO | evolvingSEO