Server instability - punishment from Google?
-
My site has about 50k pages indexed in Google. We are well respected, and we believe our stories add a lot of value for users. However, we are having serious problems with our hosting service, and the site has been unavailable for more than 15 hours.
Unfortunately we also have no provision for him to return to work. I wonder if this kind of instability can cause some punishment from Google, and if so, whether there is anything we can do to tell Google that we are aware of the problem and working to resolve it.
-
I don't think you will be harmed by Google. Keri posted a good link below about what to do when a website is down. You should give it a read; I did.
-
Hi Clovis,
This recent YouMoz post has some recommendations for what to do when your site is down. Not sure if you'll be able to use them, but it's worth a read. Best of luck!
http://www.seomoz.org/blog/how-to-handle-downtime-during-site-maintenance
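For context, the standard advice in posts like that one is to answer requests with HTTP 503 (Service Unavailable) plus a `Retry-After` header during an outage, so crawlers treat it as temporary rather than deindexing pages. A minimal sketch using Python's standard library (the handler name, port, and one-hour retry window are illustrative assumptions, not from the linked post):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 + Retry-After while the site is down."""

    def do_GET(self):
        self.send_response(503)                   # temporary outage, not a dead page
        self.send_header("Retry-After", "3600")   # suggest crawlers retry in an hour
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1><p>Back soon.</p>")

    def do_HEAD(self):
        # Crawlers often issue HEAD requests too; give them the same status.
        self.send_response(503)
        self.send_header("Retry-After", "3600")
        self.end_headers()

def run(port=8080):
    """Serve the maintenance response until interrupted."""
    HTTPServer(("", port), MaintenanceHandler).serve_forever()
```

In practice you would point your load balancer or web server config at something like this (or use the equivalent 503 rules in Apache/nginx) only while the real backend is unreachable.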
-
Thanks for your attention!
-
I don't understand when you say "Unfortunately we also have no provision for him to return to work."
I'm sorry, I'm Brazilian and don't speak English very well.
I meant to say that we have no estimate for when the site will be back online. We do have a backup, but it is very old, from 3 months ago. Could this also be harmful?
-
Sounds like a tough situation. Once your site gets back online, check Google Webmaster Tools to see what Google is reporting there. You might be able to reply with a message explaining what happened.
As long as you can get your site back online, you might only see a drop (if any) for a few days.
Good luck!
-
Clovis
You state: **However, we are having serious problems with my hosting service and the site is unavailable for more than 15 hours.**
Do you have a plan to have the site back up shortly? Next few days even?
**I wonder if this kind of instability can cause some punishment on Google**
Assuming it is not a regular occurrence and the site will be back up in the next day or two, I do not believe it will be a problem with Google. One thing you could do as soon as it is back up is request reindexing in Google Webmaster Tools (GWMT); that just makes sure anything that may have been missed gets addressed.
I don't understand when you say: **Unfortunately we also have no provision for him to return to work.**
Do you mean the server, or an employee who has in some way caused the problem? Again, everything will center on your timetable for getting the site back up. If you can fill in some of the details, it will help the mozzers assist you.
best