Does Switching Web Hosts Hurt SEO?
-
A few months ago, my site was shut down by BlueHost because of performance issues, so I moved it to WP Engine and cleaned up most of the plugins. Since then, my search engine traffic has dropped by more than 50%. Does switching web hosts hurt SEO?
Thanks!
-
Yeah, losing the redirects can definitely cause ranking and SEO problems. This can happen if the contents of the old site's .htaccess file didn't get carried over to the new site, or, if you were using the WordPress Redirection plugin, if the plugin somehow got deactivated or its redirects were lost in the move.
Get those crawl errors fixed, then give the SEs some time to recrawl and see if most of the problem goes away.
By the way - nice looking site!
Paul
PS: Marking whichever answer you found most helpful will help out other users and give a bit of a points boost as well.
-
Thanks again. Most of the crawl errors are 404. I think I had redirects on the old host that didn't transfer over. We're working on fixing those now.
I'm pretty sure I have Google Analytics on my 404 page - I use the Thesis template for WordPress and have Analytics set to load on every page.
-
Yeah, the combination of a lot of crawl errors and the reduction in kilobytes downloaded suggests the SEs are having trouble indexing your site, especially if the crawl errors are 404s. The SEs will want to drop those pages from their index, thinking they no longer exist, which will cost you that traffic.
What's the major category of crawl errors you're seeing? And do you have your Google Analytics code installed on your 404 page? (If not, you should.)
Paul
-
Thanks Paul! The site traffic dropped the day of the move.
Google traffic dropped 31% and Bing traffic dropped 18%.
I just checked crawl errors and WOW! There are a lot. Could that be causing the problems? I also noticed that the kilobytes downloaded per day have decreased a lot since the move.
Thanks again.
-
As long as nothing's changed in terms of site structure/URLs and auxiliary files, changing servers shouldn't have any effect on SEO, Jodi.
First thing to make certain is that your auxiliary files (.htaccess, robots.txt, and XML sitemap) are the same as on the old site. I've seen many instances where new default versions were applied on the new server, causing problems. Make sure your XML sitemap is in place, is complete, and is updating correctly.
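One quick way to check the sitemap after a move is to diff the old and new files: any URL that was in the old sitemap but is missing from the new one is a prime 404 candidate. A minimal sketch in Python (the sitemap contents below are invented for illustration):

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

# Hypothetical snippets standing in for the old and new sitemap files.
old_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about/</loc></url>
</urlset>"""

new_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
</urlset>"""

# URLs in the old sitemap but not the new one likely now 404.
missing = sorted(set(sitemap_urls(old_xml)) - set(sitemap_urls(new_xml)))
print(missing)  # ['https://example.com/about/']
```

Anything that turns up in `missing` should either exist on the new site or have a 301 pointing at its replacement.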
If your old .htaccess file had customized redirects in it, those may not have been transferred to the new server. (This is a manual step - whoever performed the migration would have needed to do it.)
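For reference, customized redirects in an .htaccess file typically look something like this (the paths here are hypothetical):

```apache
# Hypothetical examples - the real rules live in the OLD site's .htaccess
# and need to be copied to, or recreated on, the new server.
Redirect 301 /old-page/ https://example.com/new-page/
RedirectMatch 301 ^/2010/(.*)$ https://example.com/archive/$1
```

If the new host doesn't honor .htaccess, the same mappings can usually be recreated with a redirect plugin or the host's own redirect tool.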
You can also pop into Google Webmaster Tools and make sure it's not showing any unexpected crawl errors or an increase in 404s. Also check that the crawl rate seems to be about the same as it was before the move.
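When you're working through the 404 report, it helps to see the first status code each old URL returns, without following redirects, since that's what the crawler sees. A rough sketch, assuming you feed it URLs pulled from the crawl-error report (the commented-out URL is hypothetical):

```python
import http.client
from urllib.parse import urlparse

def first_status(url):
    """HEAD a URL without following redirects and return the raw status
    code - the same first response a search-engine crawler receives."""
    p = urlparse(url)
    Conn = http.client.HTTPSConnection if p.scheme == "https" else http.client.HTTPConnection
    conn = Conn(p.netloc, timeout=10)
    conn.request("HEAD", p.path or "/")
    return conn.getresponse().status

def classify(code):
    """Translate a status code into what it means for a migrated URL."""
    if code in (301, 308):
        return "permanent redirect - ideal for a moved URL"
    if code in (302, 307):
        return "temporary redirect - consider making it a 301"
    if code == 200:
        return "ok"
    if code == 404:
        return "missing - needs a redirect"
    return "check manually"

# Hypothetical usage against URLs from the 404 report:
# for url in ["https://example.com/old-post/"]:
#     print(url, classify(first_status(url)))
print(classify(404))  # missing - needs a redirect
```

Every URL that classifies as "missing" is one the SEs will eventually drop from the index unless it gets a 301 to its new location.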
Did you see the traffic drop exactly at the time of the server move? Or did it happen gradually over a couple of weeks? Or did you see it drop quickly, but not right at the move date? I ask because we need to rule out the possibility that something unrelated to the move affected traffic at the same time.
There have been quite a few major algorithm updates in the last couple of months that could also be culprits.
Use your analytics to see whether organic traffic from Google and Bing both dropped at the same time and at about the same rate. That will tell you whether one SE's algorithm change hit you, or whether both SEs were affected equally by a structural problem after the server move.
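To make that comparison concrete, here's a tiny sketch using the drop percentages reported earlier in the thread; the raw session counts are invented for illustration:

```python
def pct_drop(before, after):
    """Percentage drop between two traffic figures, to one decimal place."""
    return round((before - after) / before * 100, 1)

# Invented session counts chosen to match the drops reported in this
# thread: Google organic fell 31%, Bing organic fell 18%.
google = pct_drop(5200, 3588)
bing = pct_drop(900, 738)
print(google, bing)  # 31.0 18.0
```

Drops of similar magnitude in both engines right at the move date point toward a site-side structural cause; a large drop in only one engine points toward that engine's algorithm instead.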
Let us know what you find & good luck!
Paul