Server instability - will Google penalize us?
-
Our site has about 50k pages indexed in Google. We are well respected, and we believe our stories add a lot of value for users. However, we are having serious problems with our hosting service, and the site has been unavailable for more than 15 hours.
Unfortunately we also have no provision for him to return to work. I wonder if this kind of instability can cause some punishment on Google. If so, I wonder if there is anything we can do to tell Google that we are aware and working to resolve the problem.
-
I still don't think you will be harmed by Google. Keri posted a good link below about handling a down website. You should give it a read. I did.
-
Hi Clovis,
This recent YouMoz post has some recommendations on what to do when your site is down. Not sure if you'll be able to use them, but it's worth a read. Best of luck!
http://www.seomoz.org/blog/how-to-handle-downtime-during-site-maintenance
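For reference, the core advice in posts like that one is to answer requests with an HTTP 503 ("Service Unavailable") plus a `Retry-After` header while the site is down, so Google treats the outage as temporary instead of dropping pages from the index. A minimal sketch of the idea (illustrative Python, not code from the linked post; the flag and handler names are made up):

```python
# Hypothetical sketch: while the host is broken, answer everything with a
# 503 and a Retry-After header. A 503 tells crawlers "temporarily down,
# come back later" rather than "this page is gone."

MAINTENANCE_MODE = True  # toggle while the hosting problem is being fixed

def handle_request(path: str) -> tuple[int, dict[str, str], str]:
    """Return (status, headers, body) for an incoming request."""
    if MAINTENANCE_MODE:
        # Retry-After is in seconds; it hints when crawlers should retry.
        return 503, {"Retry-After": "3600"}, "Site temporarily down for maintenance."
    return 200, {}, "Normal page content."

status, headers, body = handle_request("/")
```

Once hosting is restored, switching the flag off returns normal 200 responses and crawling resumes.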
-
Thanks for your attention!
-
I don't understand when you say "Unfortunately we also have no provision for him to return to work."
I'm sorry, I'm Brazilian and don't speak English very well.
I meant that we have no estimate for when the site will be back online. We do have a backup, but it is very old, from 3 months ago. Could that also be harmful?
-
Sounds like a tough situation. Once your site is back online, check Google Webmaster Tools to see what messages are waiting there. You might be able to reply with a message telling them what happened.
As long as you can get your site back online, you might only see a drop (if any) for a few days.
Good luck!
-
Clovis
You state: **However, we are having serious problems with my hosting service and the site is unavailable for more than 15 hours.**
Do you have a plan to have the site back up shortly? Next few days even?
**I wonder if this kind of instability can cause some punishment on Google**
Assuming it is not a regular occurrence and the site will be back up in the next day or two, I do not believe it will be a problem with Google. One thing you could do as soon as it is back up is request reindexing in GWMT. That just makes sure anything that may have been missed gets picked up.
I don't understand when you say **Unfortunately we also have no provision for him to return to work.**
Do you mean the server, or an employee who in some way caused the problem? Again, everything will center on your timetable for getting the site back up. If you can fill in some of the details, it will help Mozzers assist you.
best
Related Questions
-
Does Google and Other Search Engine crawl meta tags if we call it using react .js ?
We have a site which has only one URL, and all other pages are its components, not separate pages. Whichever page we click is rendered with React.js, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO? Website: http://www.mantistechnologies.com/
White Hat / Black Hat SEO | RobinJA0
Any more info on potential Google algo update from April 24th/25th?
Apart from an article on Search Engine Roundtable, I haven’t been able to find anything out about the potential algorithm update that happened on Monday / Tuesday of this week. One of our sites (finance niche) saw drops in rankings for bad credit terms on Tuesday, followed by total collapse on Wednesday and Thursday. We had made some changes the previous week to the bad credit section of the site, but the curious thing here is that rankings for bad credit terms all over the site (not just the changed section) disappeared. Has anyone else seen the impact of this change, and are there any working theories on what caused it? I’m even wondering whether a specific change has been made for bad credit terms (i.e. the payday loan update)?
White Hat / Black Hat SEO | thatkinson0
How to deal with spam heavy industries that haven't gotten the hammer from Google?
One of our clients works in the video game category - specifically, helping people rank higher in games like League of Legends. In spite of our trying to do things the right way with white hat link building, we've suffered when trying to compete with others who are using comment and forum spam, private blog networks, and other black hat tactics. Our question is - what is the right approach here from a link building perspective? Is it an "if you can't beat them, join them" or do we wait it out and hope Google notices and punishes those who don't play nice? Some test terms to see what we're up against: "elo boost" and "lol coach." Would love to hear thoughts from anyone who's dealt with a similar situation.
White Hat / Black Hat SEO | kpaulin0
How does Google know if rich snippet reviews are fake?
According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add some HTML code and write the review. How does Google do any validation on whether these reviews are legitimate or not?
White Hat / Black Hat SEO | wlingke0
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
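The load-based throttling described above boils down to a small decision function. A hypothetical sketch of that logic (the thresholds, function names, and load scale are illustrative assumptions, not a real API):

```python
import random

# Sketch of the poster's idea: derive what fraction of bot requests to
# refuse with a 503 from current server load (here normalized to 0.0-1.0),
# then decide per bot request. User traffic would never pass through this.

def throttle_fraction(load: float, soft_limit: float = 0.6,
                      hard_limit: float = 0.95) -> float:
    """Fraction of bot requests to answer with 503 at a given load."""
    if load <= soft_limit:
        return 0.0   # healthy: serve every bot request
    if load >= hard_limit:
        return 1.0   # overloaded: refuse all bot traffic for now
    # Ramp linearly between the soft and hard limits.
    return (load - soft_limit) / (hard_limit - soft_limit)

def status_for_bot(load: float, rng: random.Random) -> int:
    """200 to serve content, 503 to tell the bot to back off and retry."""
    return 503 if rng.random() < throttle_fraction(load) else 200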
White Hat / Black Hat SEO | internetwerkNU1
Google SEVERE drop as of last week (oct 10) on long standing .org site
Hello experts, I wanted some input if possible. I own a .org informational site that has been #1 in its category in Google, Yahoo, and Bing under a major keyword for years. The site dates back to 2005. All of a sudden it dropped on August 10 (Google only; Yahoo and Bing still #1) but remained at the top for the primary keyword it is named after (xxxxyyyzzz.org), and then on Oct 9-10 it dropped from the page 1 top ranking it had held for years on that primary keyword to page 13. I don't know where to begin to look. Any ideas how something like this could happen and which "stones" I should turn? We purchased the website and are not SEO gurus, so we're just not sure. Any help would be appreciated.
White Hat / Black Hat SEO | TBKO1
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz community,
Our portfolio of around 15 internationalized web pages received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgment on what else we can try.
As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (i.e., language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
1. Is there such a thing as an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked up all pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty given the background information? Since the drops started already in November 2010, we doubt the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages? Are there things we should try first?
Any help is greatly appreciated. SEOmoz rocks. /T
(Attachment: cxK29.png)
White Hat / Black Hat SEO | tomypro0
Server-Side JavaScript Redirects
I would like to use a server-side JavaScript redirect that applies only to visitors referred from a certain site. So let's say anyone clicking a link to my site's page-A from seomoz.org would automatically be redirected to page-B. All other users, as well as direct and search engine traffic, would only see the regular page-A. The reason I am doing this is that the linking site links to page-A, which doesn't serve the user the correct content. Rather than contacting the webmaster to change the link to point to page-B, I want to redirect them. Is there any danger of Google penalizing this as cloaking? And how would they be able to tell?
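For illustration only (Python standing in for the server-side logic, and the paths and referrer list are hypothetical), the referrer check the question describes boils down to something like this:

```python
from typing import Optional

# Hypothetical sketch of the referrer-conditional redirect described in the
# question. Serving different responses for the same URL depending on who
# is asking is the pattern cloaking guidelines target, so this is shown to
# clarify the mechanics, not to recommend it.

REDIRECT_REFERRERS = {"seomoz.org"}  # illustrative referrer list

def response_for(path: str, referrer_host: Optional[str]) -> tuple:
    """Return (status, headers): 302 to page-B only for listed referrers."""
    if path == "/page-a" and referrer_host in REDIRECT_REFERRERS:
        return 302, {"Location": "/page-b"}
    return 200, {}  # everyone else (direct, search engines) sees page-A
```

Googlebot would not normally arrive with a seomoz.org referrer, so it would see the 200 page; the risk is that the inconsistency gets observed or reported, at which point conditional behavior like this tends to look like cloaking. A redirect or canonical that applies to every visitor avoids that ambiguity entirely.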
White Hat / Black Hat SEO | zachc_coffeeforless.com0