Server down - What will happen to the SERP?
-
Hi everybody, we have a lot of websites (about 100) on one server in Italy. The server crashed 5 days ago and now it should come back online (I hope!).
What will happen to the SERPs? What should I do to recover the rankings for every keyword? New links, new content, just wait... what?
Thanks
-
Samuele-
Sorry to hear that the server was down for five days. That's not a lot of fun to deal with.
Here's a great link to an interview with Matt Cutts about website downtime and how it affects Google rankings:
Basically, if the server is down for a day, you're fine. More downtime than that, though, and Google sees this as a signal that the user experience might not be amazing.
My recommendations to prevent this from happening again:
1. Back up each site to at least two locations, so that if a server like this goes down, you can migrate the sites to a different host quickly. Make sure you have both the database and the site contents backed up. I recommend backing up locally to a hard drive as well as to a cloud-based location. For a critical site, it may be worth having a "hot" backup site that can be pushed live quickly and easily via DNS. For a server with 100+ sites on it, keep a list of all of the sites it hosts, ordered by importance, so you know which are most critical to work on first. (A minimal backup sketch follows this list.)
2. Make sure the DNS doesn't route through the server with the 100+ sites on it. Use the DNS controls at your domain name registrar instead. That way you can quickly re-point the DNS to a new hosting platform rather than waiting for the server to come back up.
3. Consider moving to a different hosting platform with better uptime/reliability. 100+ sites on one server is a lot of eggs in one basket. (Note: I've done something similar in the past, and it's not worth it to have all of those sites sitting together on a shared server. Better to break them up and spread them across several different servers if possible.)
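To make point 1 concrete, here is a minimal backup sketch in Python. Everything in it is an assumption to adapt: the paths, the off-site destination, the use of MySQL, and the convention of one database per site named after its folder. It also relies on mysqldump, tar, and rsync being available on the server.

```python
import datetime
import pathlib
import subprocess

# Hypothetical paths and names -- adjust for your own server.
SITES_ROOT = pathlib.Path("/var/www")       # where the site files live
BACKUP_DIR = pathlib.Path("/mnt/backup")    # first backup location (local disk)
OFFSITE = "backup-user@offsite.example.com:/backups/"  # second, off-server copy

def backup_site(site_dir: pathlib.Path) -> None:
    stamp = datetime.date.today().isoformat()
    archive = BACKUP_DIR / f"{site_dir.name}-{stamp}.tar.gz"

    # 1. Dump the database next to the site files so both travel together
    #    (assumes one MySQL database per site, named after its folder).
    db_name = site_dir.name.replace(".", "_")
    with open(site_dir / "db-dump.sql", "wb") as fh:
        subprocess.run(["mysqldump", db_name], stdout=fh, check=True)

    # 2. Archive the site contents plus the fresh dump.
    subprocess.run(
        ["tar", "-czf", str(archive), "-C", str(site_dir.parent), site_dir.name],
        check=True,
    )

    # 3. Ship the archive to the second, off-server location as well.
    subprocess.run(["rsync", "-av", str(archive), OFFSITE], check=True)

if __name__ == "__main__":
    # Back up every site on the server; sort so the run order is predictable.
    for site in sorted(SITES_ROOT.iterdir()):
        if site.is_dir():
            backup_site(site)
```

Run regularly from cron, this gives you the database-plus-files pairs you'd need to stand a site up on a different host in a hurry.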
Finally, once the sites are back up, I'd try to bolster their importance with additional relevant content, inbound links, social media, etc. I might also suggest a permission-based email campaign to past customers to bring them back and let them know about the site outage.
Hope this helps...
-- Jeff
-
Yup, just wait. However, I would consider switching to a better server; a five-day downtime is a long downtime! Look for a more reliable solution.
Related Questions
-
.com geotagging redirect to subdomains - will it affect SEO?
Hi guys, We have a .com domain with geoIP on it, so UK visitors go to .co.uk and US visitors go to .com/us. We're just migrating over to another platform, so we're thinking of keeping a "dummy" server just to do this geoIP pointing for us. Essentially .com will just point over to the right place and hold a specific .com/abc (which is generic for everyone worldwide).
Current scenario:
.com (Magento + geoIP)
.com/us (US Magento)
.co.uk (UK - geoIP redirect to Shopify)
.com/abc (sits on the Magento server)
Wanted scenario:
.com - used for geoIP and a specific .com/abc (for all users)
.co.uk (UK) - Shopify eCom
.com/us -> migrated to us.xx.com (USA) - Shopify eCom
I just wanted to know if this will affect our rankings on Google? Also, any advice on the best practices here would be great. Thanks! Nitesh
White Hat / Black Hat SEO | Infruition
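For what it's worth, the "dummy" geoIP pointer described above usually amounts to a country lookup plus a redirect. A minimal sketch, assuming Flask and MaxMind's GeoLite2 country database; the domains and paths are placeholders, not the asker's real setup:

```python
from flask import Flask, redirect, request
import geoip2.database
import geoip2.errors

app = Flask(__name__)
# GeoLite2 country database from MaxMind -- path is a placeholder.
reader = geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb")

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def geo_route(path):
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        country = None
    if country == "GB":
        # UK visitors go to the .co.uk Shopify store.
        return redirect(f"https://www.example.co.uk/{path}", code=302)
    # Everyone else stays on the .com side (US store shown as an example).
    return redirect(f"https://www.example.com/us/{path}", code=302)
```

A 302 (rather than 301) is the usual choice for geo redirects, since the destination depends on who is asking. Keep in mind that Googlebot crawls mostly from US IP addresses, so whatever the non-UK branch serves is largely what gets indexed for the .com.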
Mobile SERP Thumbnail Image Control
Is there any way we can control the image that is selected next to the mobile SERPs? What Google selects as the mobile SERP thumbnail for a few of our results is not conducive to a high CTR.
White Hat / Black Hat SEO | gray_jedi
We lost 60-70% of our organic traffic but no penalty - what happened?
Hi Mozzers! I need some help/advice. I'm running a sports betting site - superbetting.com - and around 16-19th May our organic traffic suddenly dropped by 60-70%, and ever since we've been struggling to find the cause and, not least, to do something about it. A few observations/thoughts:
1. It seems we suddenly have quite a few inbound links from Russia, without promoting our content/site towards Russian users; nor do we have any Russian content. Should we disavow those links and/or try to contact the sites to get our links removed?
2. Looking in Ahrefs, I can see that the anchors are also suddenly dominated by Russian. Maybe obvious given the above, but still strange...
3. We have struggled with spammers trying to deploy links in our forum and have just recently removed them (or at least we think we have), but could those bad links have been hurting us over time?
4. Google ran an algo update in May regarding "quality signals", and I'm fully aware that our site may not be top-notch, but I can't believe that should have hit us that hard, since I (and I may be biased :)) would say that there are far lousier sites ranking better than us now than before.
Any feedback would be appreciated. Thanks! Mike
White Hat / Black Hat SEO | skjorte1974
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.
Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I'll invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU
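The load-dependent 503 idea above fits naturally into a piece of WSGI middleware that sits in front of every site on the server. A minimal sketch; the bot list, load threshold, and Retry-After value are arbitrary assumptions:

```python
import os
import random

BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")  # bots we may throttle
LOAD_LIMIT = 4.0  # 1-minute load average above which bot-shedding starts

class BotThrottleMiddleware:
    """WSGI middleware: answer a load-dependent share of bot requests
    with 503 + Retry-After, leaving user traffic untouched."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        load = os.getloadavg()[0]  # whole-server load, not one site's traffic
        if load > LOAD_LIMIT and any(tok in ua for tok in BOT_TOKENS):
            # The share of shed requests grows as the overload grows.
            shed_ratio = min(1.0, (load - LOAD_LIMIT) / LOAD_LIMIT)
            if random.random() < shed_ratio:
                start_response("503 Service Unavailable",
                               [("Retry-After", "120"),
                                ("Content-Type", "text/plain")])
                return [b"Temporarily overloaded, please retry later.\n"]
        return self.app(environ, start_response)
```

Sending a Retry-After header with the 503 tells well-behaved crawlers when to come back; Google treats 503 as a temporary condition, though a crawler that sees 503s for an extended period may slow down or drop pages, so the shed ratio should stay conservative.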
I am tempted to purchase a listing on an industry-specific website directory with high domain authority. Will that be frowned upon as buying links?
I am tempted to purchase a listing on an industry-specific website directory (http://marketingresourcedirectory.ama.org/) with high domain authority. Will that be frowned upon as buying links?
White Hat / Black Hat SEO | SearchParty
Beating the file-sharing sites in SERPs - Can it be done and how?
Hi all, A new client of mine is an online music retailer (CDs, vinyl, DVDs, etc.) who is struggling against file-sharing sites that are taking precedence over the client's results for searches like "tropic of cancer end of things cd". This is a legal retailer trying to make an honest living, who's having to go up against the death knell of the music industry - torrents and the like. If you think about it, with all the penalties Google is fond of dealing out, we shouldn't even be getting a whiff of file-sharing sites in the SERPs, right? How is it that file-sharing sites are still dominating? Is it simply because of the enormous amounts of traffic they receive? Does traffic determine ranking? How can you go up against torrent and download sites in this case? You can work on the on-site stuff and get bloggers to mention the client's pages for particular album reviews, artist profiles, etc., but what else could you suggest I do? Thanks,
White Hat / Black Hat SEO | Martin_S
Dramatic fall in SERPs for all keywords at end of March 2012?? Help!
Hi, Our website www.photoworld.co.uk has been improving its SERPs for the last 12 months or so, achieving page 1 rankings for most of our key terms. Then suddenly, around the end of March, we suffered massive drops for nearly all of our key terms (see attached image for more info). Basically, I wondered if anyone had any clues about what Google has suddenly taken a huge dislike to on our site, and what steps we can put in place to aid rankings recovery ASAP. Thanks
White Hat / Black Hat SEO | cewe
Server-Side JavaScript Redirects
I would like to use a redirect, through server-side JavaScript, that fires only for visitors referred from a certain site. So let's say anyone clicking on a link to my site's page A from seomoz.org would automatically be redirected to page B. All other users, as well as direct and search engine traffic, would only see the regular page A. The reason I am doing this is that the linking site is linking to page A, which doesn't serve the user the correct content. Rather than contacting the webmaster to change the link to point to page B, I want to redirect them. Is there any danger of Google penalizing this as cloaking? And how would they be able to tell?
White Hat / Black Hat SEO | zachc_coffeeforless.com
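For reference, the referrer-conditional redirect described above is only a few lines on the server. A sketch in Flask rather than server-side JavaScript; the page names and referring domain come from the question, everything else is an assumption:

```python
from flask import Flask, redirect, render_template, request

app = Flask(__name__)

@app.route("/page-a")
def page_a():
    referrer = request.referrer or ""
    if "seomoz.org" in referrer:
        # Visitors arriving via the external link land on page B instead.
        return redirect("/page-b", code=302)
    # Direct, search-engine, and all other traffic sees the regular page A.
    return render_template("page_a.html")
```

Because search engine crawlers send no Referer header, they would always see page A while some human visitors get page B; that divergence between what users and crawlers experience is exactly what makes referrer-based redirects risky from a cloaking standpoint. If page B is simply the right destination, a plain 301 from page A (or asking the linking site to update the URL) sidesteps the issue entirely.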