Server down - What will happen to the SERP?
-
Hi everybody, we have a lot of websites (about 100) on one server in Italy. The server crashed 5 days ago and should come back online now (I hope!).
What will happen to the SERPs? What should I do to recover the ranking of every keyword? New links, new content, just wait... what?
Thanks
-
Samuele-
Sorry to hear that the server was down for five days. That's not a lot of fun to deal with.
Here's a great interview with Matt Cutts about website downtime and how it affects Google rankings:
Basically, if the server is down for a day, you're fine. More downtime than that, though, and Google sees this as a signal that the user experience might not be amazing.
My recommendations to prevent this from happening again:
1. Back up each site to at least two locations. If a server like this goes down, you can then migrate the sites to a different host quickly. Make sure you have both the database and the site files backed up. I recommend backing up locally to a hard drive as well as to a cloud-based location. For a critical site, it may be worth keeping a "hot" backup copy that can be pushed live via DNS quickly and easily. For a server hosting 100+ sites, keep a list of all of the sites on it, ordered by which are most critical to restore first. (A minimal backup sketch follows this list.)
2. Make sure that DNS doesn't route through the server with 100+ sites on it. Use the DNS controls at your domain name registrar instead. That way you can quickly re-point DNS to a new hosting platform rather than waiting for the old server to come back up. (A small DNS check sketch also follows the list.)
3. Consider moving to a different hosting platform with better uptime and reliability. 100+ sites on one server is a lot of eggs in one basket. (Note: I've done something similar in the past, and it's not worth having all of those sites sitting together on a shared server. It's better to break them up and spread them across several different servers if possible.)
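To make point 1 concrete, here is a minimal nightly backup sketch in Python. The site names, paths, and the MySQL/mysqldump assumption are hypothetical placeholders, not anything specific to your stack, so treat it as a starting point to adapt.

```python
#!/usr/bin/env python3
"""Minimal backup sketch: dump each site's database, archive its files,
then copy the dated result to a second location. All names and paths are
placeholders."""

import shutil
import subprocess
from datetime import date
from pathlib import Path

# Hypothetical list of sites, ordered by how critical they are to restore first.
SITES = [
    {"name": "shop-example", "docroot": "/var/www/shop-example", "db": "shop_example"},
    {"name": "blog-example", "docroot": "/var/www/blog-example", "db": "blog_example"},
]

LOCAL_BACKUP = Path("/backups/local")    # e.g. an attached hard drive
OFFSITE_BACKUP = Path("/mnt/offsite")    # e.g. a mounted cloud storage bucket

def backup_site(site: dict) -> None:
    stamp = date.today().isoformat()
    target_dir = LOCAL_BACKUP / site["name"] / stamp
    target_dir.mkdir(parents=True, exist_ok=True)

    # 1. Dump the database (assumes MySQL and credentials in ~/.my.cnf).
    dump_file = target_dir / f"{site['db']}.sql"
    with dump_file.open("wb") as fh:
        subprocess.run(["mysqldump", site["db"]], stdout=fh, check=True)

    # 2. Archive the site files (docroot) next to the dump.
    shutil.make_archive(str(target_dir / "files"), "gztar", site["docroot"])

    # 3. Copy the whole dated folder to the second, offsite location.
    shutil.copytree(target_dir, OFFSITE_BACKUP / site["name"] / stamp,
                    dirs_exist_ok=True)

if __name__ == "__main__":
    for site in SITES:  # already in priority order
        backup_site(site)
```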
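And for point 2, a quick hedged check that a domain could be re-pointed fast. It assumes the dnspython package; the domain and TTL threshold are placeholders. If the A record's TTL is high, lowering it at the registrar makes any emergency re-route take effect sooner.

```python
# Quick DNS sanity check using dnspython (pip install dnspython).
import dns.resolver

DOMAIN = "example.com"     # hypothetical domain
MAX_TTL_SECONDS = 3600     # re-pointing is slow if the TTL is much higher

ns_answer = dns.resolver.resolve(DOMAIN, "NS")
print("Nameservers:", [str(r.target) for r in ns_answer])

a_answer = dns.resolver.resolve(DOMAIN, "A")
print("A records:", [r.address for r in a_answer])
print("A record TTL:", a_answer.rrset.ttl)

if a_answer.rrset.ttl > MAX_TTL_SECONDS:
    print("TTL is high; consider lowering it at the registrar so an "
          "emergency re-route propagates quickly.")
```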
Finally, once the site is back up, I'd try to bolster its importance with additional relevant content, inbound links, social media, etc. I might also suggest a permission-based email campaign to past customers to bring them back and let them know about the outage.
Hope this helps...
-- Jeff
-
Yup, just wait. However, I would consider switching to a better server; a five-day downtime is a long one! Look for a more reliable solution.
Related Questions
-
Will pillar posts create a duplicate content issue if we un-gate ebooks/guides and use exact copy from blogs?
Hi there! With the rise of pillar posts, I have a question about the duplicate content issue they may present. If we un-gate ebooks/guides and use (at times) exact copy from our blog posts, will this harm our SEO efforts? This would go against the goal of our posts and is mission-critical to understand before we implement pillar posts for our clients.
White Hat / Black Hat SEO | Olivia9540 -
WordPress Category Archives - Index - but will this cause duplication?
Okay, something I am struggling with. I'm using Yoast on a recipe blog, and the category archives are being optimized and indexed because I am adding custom content to them and then listing the recipes below. My question is: if I am indexing the category archives and using them to add custom content above, while allowing the recipe excerpts from the category to be listed underneath, will these recipe excerpts be picked up as duplicate content?
White Hat / Black Hat SEO | Kelly33300 -
IT want to do a name server redirect
Hi, I am in a little bit of a pickle, and hope that you clever people can help me... A little background: In April this year we relaunched one of our brands as a standalone business. I set up page to page 301 redirects from the old website to the new branded domain. From an SEO perspective this relaunch went amazingly smoothly - we only lost around 10% of traffic and that was just for a couple of months. We now get more traffic than ever before. Basically it's all going swimmingly. I noticed yesterday that the SSL certificate on the old domain has expired, so I asked IT to repurchase one for us to maintain the 301 redirects. IT are saying that they would prefer to do a name server redirect instead, which would remove all the page to page 301s. They are saying that this would maintain the SEO. As far as I am aware this wouldn't. Please can someone help me put together a polite but firm response to basically say no? Thanks, I really welcome and appreciate your help on this! Amelia
White Hat / Black Hat SEO | CommT0 -
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I'll invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your experts' opinions...
White Hat / Black Hat SEO | internetwerkNU1 -
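For anyone curious what the dynamic 503 idea in the question above might look like, here is a rough sketch assuming a Python/Flask front end (the asker's custom CMS is different). The bot list, load threshold, and rejection rule are invented placeholders, not the actual implementation, and a real deployment would need shared state across worker processes.

```python
# Load-based 503 throttling for bots, sketched as a Flask before_request hook.
import os
import random

from flask import Flask, Response, request

app = Flask(__name__)

BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")  # bots to throttle (placeholder list)
LOAD_THRESHOLD = 4.0                                # 1-minute load average (placeholder)

@app.before_request
def throttle_bots():
    ua = (request.headers.get("User-Agent") or "").lower()
    if not any(token in ua for token in BOT_TOKENS):
        return None  # normal users always get content

    load1, _, _ = os.getloadavg()
    if load1 < LOAD_THRESHOLD:
        return None  # server is healthy, let bots through

    # Above the threshold, reject a share of bot requests that grows with load.
    reject_probability = min(1.0, (load1 - LOAD_THRESHOLD) / LOAD_THRESHOLD)
    if random.random() < reject_probability:
        return Response("Service temporarily unavailable",
                        status=503,
                        headers={"Retry-After": "600"})
    return None
```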
How will Google deal with the crosslinks for my multiple domain site
Hi, I can't find any good answer to this question so I thought, why not ask Moz.com ;-)! I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch & Belgian, English, and French. I use a different TLD with a different IP for each of these languages, so I'll end up with webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr. They all link to each other, and every subpage that is translated from another site also gets a link from the other languages, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every one of these domains, which Open Site Explorer as well as Majestic SEO sees as external links. (This is happening.) My question: how will Google deal in the long run with the crosslinks coming from these domains? Some guesses I made: (1) I get full external link juice (the content is translated, so unique?); (2) I get a bit of the juice of an external link; (3) they are actually seen as internal links; (4) I'll get a penalty. Thanks in advance guys!!!
White Hat / Black Hat SEO | pimarketing0 -
Old SPAM tactic still works and gets TOP 3 in SERP?
Hi Mozers, Below you can see some examples of spam (hidden text and sneaky redirects) which have appeared in the SERPs for our branded keywords over the last 3 months. Some of them occupy very high positions in the SERPs (top 3/top 5). https://www.google.com/search?num=100&newwindow=1&safe=off&biw=1883&bih=1028&q=%22your+mac+-%22%2B%22cleanmymac%22 I sent spam reports and I'm going to continue doing so (~500 spam reports from my personal and work Google accounts). I contacted some of the hacked sites' webmasters directly and tried to help them fix the issue, but it takes a lot of my time. But 3 months!? Can you give me any advice on what to do next? Thank you!
White Hat / Black Hat SEO | MacPaw0 -
Beating the file sharing sites in SERPs - Can it be done and how?
Hi all, A new client of mine is an online music retailer (CDs, vinyl, DVDs etc.) who is struggling against file sharing sites that take precedence over the client's results for searches like "tropic of cancer end of things cd". The site is a legal retailer trying to make an honest living, having to go up against the death knell of the music industry: torrents etc. If you think about it, with all the penalties Google is fond of dealing out, we shouldn't even be getting a whiff of file sharing sites in the SERPs, right? How is it that file sharing sites are still dominating? Is it simply because of the enormous amounts of traffic they receive? Does traffic determine ranking? How can you go up against torrent and download sites in this case? You can work on the on-site stuff and get bloggers to mention the client's pages for particular album reviews, artist profiles etc., but what else could you suggest I do? Thanks,
White Hat / Black Hat SEO | Martin_S0 -
Server-Side JavaScript Redirects
I would like to use a redirect through server-based JavaScript to redirect only visitors referred from a certain site. So let's say anyone clicking on a link to my site's page A from seomoz.org would automatically be redirected to page B. All other users, as well as direct and search engine traffic, would only see the regular page A. The reason I am doing this is that the linking site links to page A, which doesn't serve the user the correct content. Rather than contacting the webmaster to change the link to point to page B, I want to redirect them. Is there any danger of Google penalizing this as cloaking? And how would they be able to tell?
White Hat / Black Hat SEO | zachc_coffeeforless.com0