Can an unreliable server hurt your SERPs?
-
Hi
I have a small personal blog about food and wine that I recently moved from Blogger to WordPress. It is currently hosted by the company that handled the migration.
This week I've noticed an 80% drop in organic traffic after losing pretty much every first-page SERP. There are no messages in WMT, I don't pay for links, and all that's on the site is original content about food and wine that I enjoy. I've never had a drop in rankings or traffic like this before.
The one thing I can say is that the guy who took over the hosting runs the site on his own server, and the website has been down more often than I'd consider reasonable, often for hours at a time (and that's only when I catch it; I don't check often).
-
Would the site be penalised for this?
-
If I move to the reliable hosting company I normally use, how long will it take to recover?
-
-
I can watch soft 404 errors increase during outages and know there will be a one-to-two-week period of lower SERP positions.
They absolutely can affect your campaign.
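One way to catch this early is to check what the server actually returns while the site is down: an error page served with a 200 OK status is what gets flagged as a soft 404, whereas a 503 tells Google the outage is temporary. A minimal sketch of that check (the error markers and wording are my own illustration, not from this thread):

```python
def classify_response(status_code, body):
    """Rough classification of a single response during an outage.

    A real outage should return a 5xx status (ideally 503 with a
    Retry-After header); a 200 whose body is an error page is a
    soft-404 risk.
    """
    error_markers = ("not found", "error", "temporarily unavailable")
    if status_code == 503:
        return "ok: temporary outage signalled correctly"
    if 500 <= status_code < 600:
        return "server error (5xx)"
    if status_code == 200 and any(m in body.lower() for m in error_markers):
        return "soft-404 risk: error page served with 200 OK"
    if status_code == 200:
        return "ok"
    return f"status {status_code}"

# A common WordPress failure mode: the database is down but the
# web server still answers 200.
print(classify_response(200, "<h1>Error establishing a database connection</h1>"))
# → soft-404 risk: error page served with 200 OK
```

If the host can't be configured to send 503 during downtime, that's another argument for moving.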
-
Hi, I agree with both Andy and Michael.
If you are going to ditch what seems to be an unreliable server, make sure your new platform delivers speed. This article by KISSmetrics highlights its importance: http://blog.kissmetrics.com/loading-time/
Good luck
Gary
-
If the server keeps going down I would also look at page load speed - http://www.webpagetest.org/ - because if the hosting is that flaky, I'd guess performance is poor all round.
I would get onto a stable server as quickly as possible.
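If you want numbers rather than occasional spot checks, a scheduled job that probes the site every few minutes will show how often it is actually down or slow. A rough sketch (the flakiness rule and thresholds are illustrative, not anyone's recommendation from this thread):

```python
import time
import urllib.request
import urllib.error


def probe(url, timeout=10):
    """One fetch: return (status_or_None, seconds_elapsed).

    status is None when the request fails outright (DNS error,
    refused connection, timeout).
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, time.monotonic() - start
    except (urllib.error.URLError, OSError):
        return None, time.monotonic() - start


def is_flaky(results, max_failure_rate=0.05, max_seconds=3.0):
    """Given a list of (status, seconds) probes, flag unreliable hosting.

    Counts both hard failures (non-200 responses) and successful but
    slow responses against the host.
    """
    failures = sum(1 for status, _ in results if status != 200)
    slow = sum(1 for status, secs in results if status == 200 and secs > max_seconds)
    return (failures + slow) / len(results) > max_failure_rate
```

Run `probe("http://example.com/")` from cron and append the results to a log; a few days of data makes the "down more than reasonable" argument concrete when you move hosts.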
-
Google has said in the past that the odd outage shouldn't be a problem, but if it is like you are saying, then yes, this could be very harmful. Matt Cutts said that if every time someone comes to your site it is down, then it will have a negative impact.
Recovery is a very open-ended question though. It could be a week, a month, or three months - sadly there is no way to determine this, and anything here is going to be guesswork.
I would say to move the site as soon as you can.
-Andy