Server instability - will Google punish my site?
-
My site has about 50k pages indexed in Google. We are well respected, and we believe our stories add a lot of value for users. However, we are having serious problems with our hosting service, and the site has been unavailable for more than 15 hours.
Unfortunately we also have no provision for him to return to work. I wonder if this kind of instability can lead to some punishment from Google, and if so, whether there is anything we can do to tell Google that we are aware of the problem and are working to resolve it.
-
I still don't think you will be harmed by Google. Keri posted a good link below about handling a site that's down. You should give it a read. I did.
-
Hi Clovis,
This recent YouMoz post has some recommendations for what to do when your site is down. I'm not sure if you'll be able to use them, but it's worth a read. Best of luck!
http://www.seomoz.org/blog/how-to-handle-downtime-during-site-maintenance
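For what it's worth, the core advice for downtime (in that post and in Google's own guidance) is to answer crawlers with HTTP 503 plus a Retry-After header, so the outage reads as temporary rather than as removed pages. Here is a minimal sketch of that idea in Python; the port, retry window, and maintenance page are illustrative, not anything from the post:

```python
# Minimal maintenance responder: returns 503 + Retry-After for every request,
# so crawlers treat the outage as temporary and come back later.
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 3600  # illustrative guess at when the site returns

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body><h1>Down for maintenance</h1></body></html>"
        self.send_response(503)  # "Service Unavailable": temporary, not gone
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

Of course, if the host is completely unreachable you can't serve anything at all; this only helps when you still control a stub server or can point DNS at one.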
-
Thanks for your attention!
-
I don't understand what you mean when you say "Unfortunately we also have no provision for him to return to work."
I'm sorry, I'm Brazilian and don't speak English very well.
I meant that we have no estimate of when the site will return to work. We also have a backup, but it is very old (about 3 months). Could restoring that be harmful as well?
-
Sounds like a tough situation. Once your site gets back online, check Google Webmaster Tools to see what it is reporting. You might be able to reply with a message telling Google what happened.
As long as you can get your site back online soon, you might only see a ranking drop (if any) for a few days.
Good luck!
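In the meantime, it's worth checking what status code the server actually returns, since a 503 is far less damaging than a 404 or a stale 200. A small sketch, assuming the server answers at all; the URL is a placeholder for your own domain:

```python
# Check which HTTP status code a (placeholder) domain returns right now.
# During an outage you want 503, not 200 (stale page) or 404 (page gone).
import urllib.error
import urllib.request

URL = "http://www.example.com/"  # placeholder: substitute your own domain

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print("Status:", resp.status)
except urllib.error.HTTPError as e:
    # urllib raises on 4xx/5xx; the code and headers are on the exception.
    print("Status:", e.code, "Retry-After:", e.headers.get("Retry-After"))
except urllib.error.URLError as e:
    print("No HTTP response at all:", e.reason)
```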
-
Clovis
You state: **"However, we are having serious problems with our hosting service, and the site has been unavailable for more than 15 hours."**
Do you have a plan to have the site back up shortly? Next few days even?
**"I wonder if this kind of instability can lead to some punishment from Google"**
Assuming it is not a regular occurrence and the site will be back up in the next day or two, I do not believe it will be a problem with Google. One thing you can do as soon as it is back up is request reindexing in Google Webmaster Tools. That just makes sure anything that may have been missed gets addressed.
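As a side note, besides the request in the GWMT interface you can also nudge a recrawl by pinging Google with your sitemap once the site is back. A minimal sketch; the sitemap URL is a placeholder, and the ping endpoint is the one Google documented at the time:

```python
# Ping Google with a (placeholder) sitemap URL to invite a recrawl.
import urllib.parse
import urllib.request

SITEMAP = "http://www.example.com/sitemap.xml"  # placeholder

ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP, safe="")
with urllib.request.urlopen(ping, timeout=10) as resp:
    print("Ping status:", resp.status)  # 200 means the ping was received
```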
I don't understand what you mean when you say **"Unfortunately we also have no provision for him to return to work."**
Are you referring to the server, or to an employee who in some way caused the problem? Again, everything will center on your timetable for getting the site back up. If you can fill in some of the details, it will help the Mozzers assist you.
best