Changing Hosting Companies - Site Downtime - Google Indexing Concern
-
We are getting ready to switch to a new hosting company. When we make the switchover, our sites will be offline for a couple of hours, and in some cases perhaps as long as 12 hours, while DNS is configured. Should we be worried about Google trying to crawl pages and finding them unavailable? Is there any real risk of Google de-indexing pages? Our guess is that Google would not de-index anything after just a short period of being unable to reach pages; it would take an extended period before Google or Bing de-indexed anything. Correct?
Just want to gut-check this before pulling the trigger on the switch to the new hosting company. We appreciate input on this, and any other thoughts about the switchover that we may not have considered.
Thanks,
- Matt
-
thanks, good resource
-
There's an excellent blog post about this at http://www.seomoz.org/blog/how-to-handle-downtime-during-site-maintenance too.
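The standard advice for planned downtime (and the core of that post) is to return a 503 (Service Unavailable) with a Retry-After header rather than 404s, so crawlers treat the outage as temporary and come back later. Below is a minimal sketch of that idea; the port, retry window, and page text are placeholder assumptions, not anything from the post or from Matt's setup.

```python
# A minimal "maintenance mode" responder; the port, retry window, and page
# text are assumed placeholders, not anything from the post above.
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 3600  # assumed one-hour window; match your expected outage


class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 tells crawlers the outage is temporary; Retry-After hints when to come back.
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1><p>We'll be back shortly.</p>")


if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```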
-
Thanks, appreciate it.
-
I would say it is preferable to set up the site on the new host before you take down the old one. While you should be "OK" with the downtime, I would not recommend it; you never know when the spiders will come along. You would probably not be de-indexed, but Google would see a bunch of errors, and you could see a drop in the SERPs and then in traffic to your site as a result. This should all recover.
I have seen traffic drop on our own sites when we have had technical difficulties. I usually spot the issues in Google Webmaster Tools or other tools, get them fixed, and the traffic comes back.
Here is the other thing: what if something goes wrong during that 12 hours? What if 12 hours becomes 24 or 48 due to unforeseen issues? A site being down for any amount of time is just bad for business and users in general. You do not want that, let alone the search engine issue. What if something goes wrong with the new host and you need to revert to the old one? This has happened to me, and trust me, you do not want it to happen to you. Murphy likes to play games with scenarios like this; I do not mess with Mr. Murphy and his laws.
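One thing that makes a fallback to the old host realistic is lowering the DNS TTL on your records a day or two before the move, so a change in either direction propagates in minutes rather than hours. A quick way to see the current TTL is sketched below; it assumes the third-party dnspython package (version 2 or later) and uses a placeholder domain, purely as an illustration.

```python
# Sketch: check the current TTL on a record before the move.
# Assumes the third-party "dnspython" package (>= 2.0): pip install dnspython
# The domain below is a placeholder.
import dns.resolver

DOMAIN = "www.example.com"  # hypothetical domain

answer = dns.resolver.resolve(DOMAIN, "A")
print(f"{DOMAIN} A records: {[r.address for r in answer]}")
print(f"Current TTL: {answer.rrset.ttl} seconds")

# If the TTL is measured in hours, drop it (e.g. to 300 seconds) at your DNS
# provider well before the cutover, then raise it again once everything is stable.
```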
If I were you, here is what I would do (see the sketch after this list):
1. Set up the new host
2. Set up your site on the new host
3. Test, test, test on the new host
4. Change the DNS from the old host to the new one
5. Watch the traffic move over
6. Test, test, test again on the new host
7. Shut down the old host
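For the "change the DNS" and "test" steps, a rough sanity check along these lines can confirm the cutover actually happened. The domain and the new host's IP below are hypothetical placeholders, and this is only an illustration, not the exact process described above.

```python
# A rough post-cutover sanity check; the domain and new-host IP are
# hypothetical placeholders, and this is only an illustration.
import socket
import urllib.request

DOMAIN = "www.example.com"    # hypothetical domain
NEW_HOST_IP = "203.0.113.10"  # hypothetical IP of the new host


def check_dns(domain, expected_ip):
    """Report which IP(s) the domain currently resolves to from here."""
    ips = {info[4][0] for info in socket.getaddrinfo(domain, 80, proto=socket.IPPROTO_TCP)}
    print(f"{domain} resolves to {ips}")
    return expected_ip in ips


def check_site(url):
    """Confirm the site answers with HTTP 200."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(f"{url} -> HTTP {resp.status}")
        return resp.status == 200


if __name__ == "__main__":
    dns_ok = check_dns(DOMAIN, NEW_HOST_IP)
    site_ok = check_site(f"http://{DOMAIN}/")
    print("Cutover looks complete" if dns_ok and site_ok else "Still propagating or misconfigured")
```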
We have overlapped the old and new hosts for up to two months just to make sure everything is set. You always back up your data, yes? Why would you not want a backup of your entire website?
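On the backup point, here is a bare-bones sketch of what grabbing a copy before the switch might look like. The paths, database name, and the assumption that MySQL credentials live in a ~/.my.cnf file are all placeholders for illustration, not details from this thread.

```python
# A bare-bones pre-switch backup sketch; the paths, database name, and the
# assumption that MySQL credentials live in ~/.my.cnf are placeholders.
import shutil
import subprocess
from datetime import date

stamp = date.today().isoformat()

# Archive the document root (assumed path) into /backups/site-YYYY-MM-DD.tar.gz
shutil.make_archive(f"/backups/site-{stamp}", "gztar", root_dir="/var/www/html")

# Dump the database; "example_db" is a placeholder name.
with open(f"/backups/db-{stamp}.sql", "w") as out:
    subprocess.run(["mysqldump", "example_db"], stdout=out, check=True)

print("Backup complete; copy these files off the old host before changing DNS.")
```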
Good luck!
-
Related Questions
-
How do internal search results get indexed by Google?
Hi all, Most of the URLs that are created by using the internal search function of a website/web shop shouldn't be indexed, since they create duplicate content or waste crawl budget. The standard way to go is to 'noindex, follow' these pages, or sometimes to use robots.txt to disallow crawling of them. The first question I have is how these pages would actually get indexed in the first place if you didn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL. There are no links leading to this URL, it is not in a sitemap, and it can't be found by navigating the website... so how can search engines index these URLs that were generated by an internal search function? Second question: let's say somebody embeds a link on his website pointing to a URL from your website that was created by an internal search. Now let's assume you used robots.txt to make sure these URLs weren't indexed. This means Google won't even crawl those pages. Is it possible then that the link that was used on another website will show an empty page after a while, since Google doesn't even crawl this page? Thanks for your thoughts, guys.
Intermediate & Advanced SEO | Mat_C
-
How to make Google index your site? (Blocked with robots.txt for a long time)
The problem is that for a long time we had a website, m.imones.lt, but it was blocked with robots.txt. Now we want Google to index it. We unblocked it 1 week or 8 days ago, but Google still does not recognize it. I type site:m.imones.lt and it says it is still blocked with robots.txt. What should be the process to make Google crawl this mobile version faster? Thanks!
Intermediate & Advanced SEO | FCRMediaLietuva
-
Copying content from a blog site (external) to a company blog site (internal)
Hi, I have a client that has several external blogs, www.blogsite1.info and www.blogsite2.info, and he also has www.companywebsite.com; the main domain, of course, is companywebsite.com. They are doing something wrong: instead of generating content inside the main domain, they create content on the blog sites and send links to the blog sites to view that content. So they are inviting their users to EXIT the website... I told him, if you want to generate content, please keep a blog INSIDE your domain at www.companywebsite.com/blog, but keep the other ones, because they are generating links (they are .info domains, which is not good, but they are nice keyword-match domains). Now he has told me he is thinking of copying and pasting the content from the external blog sites to the internal website. I warned him about generating duplicate content. But... is it really a problem? They are not on the same domain... Could Google penalize the main domain because of that? Thanks!
Intermediate & Advanced SEO | teconsite
-
How do you achieve Google Authorship verification on a site with no clearly defined authors?
Google Authorship seems to be the current buzz topic in SEO. It seems perfect for people who write lots of articles or blog posts, but what about sites where the main focus isn't articles, e.g. e-commerce sites? Can the website as a whole get verified?
Intermediate & Advanced SEO | statman87
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google Guidelines: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.” http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster Notification: “Googlebot found an extremely high number of URLs on your site” with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I’m looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
-
Does Google punish sites for Backlinks?
Here is Matt Cutts' video, for those of you who have not seen it already: http://www.youtube.com/watch?v=f4dAWb5jUws (very short). In this video Matt explains that Google does not look at backlinks. Many link-spamming sites have been detected, and many websites have received warning messages in their Google Webmaster Tools telling them to de-index these links, etc. My theory is that Google will not punish sites for backlinks. However, they manually check for "link farming" sites and warn anyone affiliated with them, just in case those links were built by a competitor. This way they can eliminate all the "bad link farm" sites and not hurt anyone who does not deserve to be hurt. Google is not going to give us all the information they use to rank; they don't want us to rank. They want us to use PPC. However, they do want to have the best SERPs available. I call it Google juggling! Thoughts?
Intermediate & Advanced SEO | SEODinosaur
-
Google+ Verification - Site Speed Optimization
So the Google+ badge verifies our site for Google Direct Connect. However, the JavaScript code for the badge itself causes the page to take 3 to 4 seconds longer to load, which is a big deal. Any ideas for a workaround?
Intermediate & Advanced SEO | inc.com
-
Site: on Google
Hello, people. I have a quick question regarding search on Google. I use the search operator [site:url] to check the indexing status of my site. Today I was checking indexing status and found that Google shows different numbers of indexed pages depending on the search settings.
1. At the default setting (10 results per page), I get about 150 pages indexed by Google.
2. With 100 results per page, I get about 52 pages indexed by Google.
Of course I used the same URL. I really want to know which figure is accurate. Please help, people!
Intermediate & Advanced SEO | Artience