Database Crash affecting our rankings
-
Here is one for you. We had a database crash and were down for an hour and 15 minutes on Dec 18th. During this time, our system automatically sent out a 301 redirect to our product home page (iboats.com/boat-parts-accessories/dm/).
On Dec 24th I noticed a huge ranking drop on a lot of our keywords, and some keywords were serving up the redirect page from our database crash in place of their normally ranked pages.
My assumption is that Google crawled our site during the database crash and cached the redirect pages in place of our regularly ranked pages. So when a search query is made that would normally display one of our ranked pages, it now displays the redirect page from the crawled cache.
And for other keywords that we would normally rank for, Google no longer has the page cached, so it drops out of the rankings.
My question: Does this assumption sound accurate? If so, I assume that over time our previously ranked pages will show up once again after Google recrawls them and saves them in its cache.
I have several sites that were affected by this:
Thanks in advance for your input.
-
Glad to hear you appear to be back on the right track
-
Update:
After unblocking Google, almost all of our rankings are back up, and in some cases a little improved. I was surprised how quickly Google recrawled and re-ranked the pages that had dropped.
I did a Fetch and submit on all my websites and pages. I'm not sure how promptly Google attends to Fetch requests, but it appears they do act on them.
I have an IT guy who doesn't feel a Fetch makes any difference. He says Google will crawl when it wants to and probably doesn't pay attention to the Fetch request.
-
I wanted to come back and report the cause of our problem with so many of our rankings dropping. We originally thought it was due to a database crash, and that could still be affecting some things.
We moved our servers to a new location, and a couple of weeks ago we switched over to some new IP addresses. When we did, a security program started blocking Google's IP addresses because they were hitting our servers with a lot of requests, and it put those addresses on a blocked list. So our pages on some of the websites stopped getting crawled, and rankings dropped. We discovered the problem while exploring the Crawl Stats in Webmaster Tools and noticing that the "kilobytes downloaded per day" graph had flat-lined. Doing a Fetch on selected pages, we could see that Google was not fetching them. After unblocking those IP addresses, the pages got crawled, our crawl stats went back up, and our rankings started going back to where they were. We still have some rankings that have not bounced back, but we assume they will as new crawls re-index those pages.
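If the blocking program supports a whitelist, one safeguard against this happening again is to verify crawler IPs the way Google recommends: a reverse DNS lookup on the IP, a check that the hostname belongs to googlebot.com or google.com, and a forward lookup to confirm it maps back to the same IP. A minimal sketch in Python (the sample IP is just an illustration):

```python
import socket

def is_real_googlebot(ip):
    """Reverse-DNS the IP, check the domain, then forward-confirm it."""
    try:
        host = socket.gethostbyaddr(ip)[0]               # reverse lookup
        if not host.endswith(('.googlebot.com', '.google.com')):
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]   # forward lookup
        return ip in forward_ips                         # must map back
    except OSError:                                      # lookup failed
        return False

# Example: check an address before letting the firewall block it.
print(is_real_googlebot('66.249.66.1'))  # sample IP from Googlebot's range
```

Anything that passes this test can safely be exempted from the rate-limiting rules.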
A lesson learned for me was to pay attention to the crawl stats. Any drastic variations could be a sign of a problem.
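To make that easier, the same signal is visible in your own access logs, so you don't have to wait for Webmaster Tools to update. A rough sketch that tallies bytes served to Googlebot per day, assuming the common combined log format (the log path and regex are illustrative):

```python
import re
from collections import defaultdict

# Combined log format: ... [day/Mon/year:time] "request" status bytes "referer" "user-agent"
LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):.*?" (\d{3}) (\d+|-) "[^"]*" "([^"]*)"')

def googlebot_kb_per_day(logfile):
    """Tally kilobytes served per day to anything identifying as Googlebot."""
    totals = defaultdict(int)
    with open(logfile) as f:
        for line in f:
            m = LINE.search(line)
            if m and 'Googlebot' in m.group(4) and m.group(3) != '-':
                totals[m.group(1)] += int(m.group(3))
    return {day: size // 1024 for day, size in totals.items()}

# Logs are chronological, so insertion order is already by date.
for day, kb in googlebot_kb_per_day('access.log').items():
    print(day, kb, 'KB')  # a sudden drop to zero is the flat-line to watch for
```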
Brad
iboats.com -
I'm pretty sure GWT is a little vague about the times Googlebot visited, with just the crawl rate shown on a day-by-day basis, and there's some lag between now and when it happened. From memory, cPanel will show when spiders visited under Latest Visitors. However, rather than looking back, I think it would be better to simply ensure this doesn't happen again by using the 302 rather than the 301. Refreshing your sitemap submissions in GWT might also help get some crawling going, with a ping or two to bring Googlebot back soon.
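On the ping idea: a sitemap ping is just an HTTP GET against the search engine's ping endpoint, so it is easy to script. A minimal sketch (the sitemap URL is an assumed example):

```python
import urllib.parse
import urllib.request

SITEMAP = 'http://www.iboats.com/sitemap.xml'  # assumed location, swap in your own

# Google's sitemap ping endpoint; Bing exposes the same pattern at bing.com/ping.
ping_url = 'http://www.google.com/ping?sitemap=' + urllib.parse.quote(SITEMAP, safe='')
with urllib.request.urlopen(ping_url) as resp:
    print(resp.status)  # 200 means the ping was received
```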
-
Also, can't we use GWT to find out the exact times, or at least approximate times, of when Google crawled a site? Or can we not find that information so precisely? In this case it does seem that the 301 vs. 302 issue is the key problem, along with the fact that the system for getting rid of the 301 didn't work out properly.
-
It seems extreme that Google would react within an hour and 15 minutes, but who knows nowadays. Keep an eye on your GWT account; if you have a decent crawl rate, things should hopefully remedy themselves. Stephen Salstrand made a good point: 302s rather than 301s should help. It's great to have something in place like a redirect for user experience in case of database failure; however, as Stephen said, it should be made clear to Google that it's only temporary.
-
Investigating it further, we have some permanent redirects that are still hanging around. I've got my IT guys working on it as we speak.
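In the meantime, a quick way to see which URLs are still answering with a 301 is to request them without following redirects. A minimal sketch (the URL list is illustrative):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Surface 3xx responses instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # unhandled, so the 3xx is raised as an HTTPError

opener = urllib.request.build_opener(NoRedirect)

for url in ['http://www.iboats.com/boat-parts-accessories/dm/']:  # your list here
    try:
        resp = opener.open(url)
        print(url, resp.status)  # 200: no redirect in place
    except urllib.error.HTTPError as e:
        print(url, e.code, '->', e.headers.get('Location'))  # 301/302 show up here
```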
I've got to run, but I would like to revisit this soon. Once we have determined the problem, I will post it here.
Thanks for your opinions. They have helped.
-
Your assumptions are broadly right. However, I have never seen Google react that quickly to a 301 redirect or even a crash. I always tell my customers: if you have a temporary issue with 404 and 500 errors, don't worry about it too much; Google won't react unless it is a persistent issue that lasts over three or four crawls.
This is why I would also investigate a penalty that just happened to take effect shortly after the crash. To rule that out, could you please tell us if some of your major keywords are still ranking well?
I don't think you should worry too much about this; just wait a few days and see if some pages recover.
-
Good point on the 301s. I'll look into that.
Thanks for your input!
-
I think your assumptions are on target. However, make sure to verify that the 301 has been removed. Also, if you still want that auto-redirect to happen on a crash, make it a 302 and not a 301, as a 301 will suck a lot of life out of your site (since search engines see it as permanent).
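To make that concrete, here is a minimal sketch of the idea using Python's standard-library WSGI server; the database check and the fallback URL are placeholders, not the site's actual setup:

```python
from wsgiref.simple_server import make_server

FALLBACK = 'http://www.iboats.com/boat-parts-accessories/dm/'  # placeholder URL

def database_is_up():
    return False  # placeholder: replace with a real connectivity check

def app(environ, start_response):
    if not database_is_up():
        # 302 Found tells crawlers the move is temporary; a 301 would tell
        # them to replace the original URL in the index.
        start_response('302 Found', [('Location', FALLBACK)])
        return [b'']
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Normal page rendered from the database.']

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()
```

The same rule applies wherever the fallback lives (application code, load balancer, or web server config): the status code on the emergency redirect must be a temporary one.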