Periodic DNS Switching for Major Website Updates - Any Downsides?
-
A company is performing some major updates to a website and the proposal to go live with the updates was explained as follows:
Once the updates are done on the testing environment and the site is ready to go live, we switch the DNS to the testing environment and then this testing environment becomes the production site. And the old production site becomes the new testing environment.
Are there any potential negatives to this?
Is there a name for this technique?
Of course, we've already considered:
- additional hosting cost
- potential performance differences
- reinstalling and setting up server settings (SSL, etc.)
-
I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until the change fully propagated, right?
The problem with changing DNS is an initial traffic drop while resolvers and caches across the internet pick up the update.
Quoted from: http://www.mattcutts.com/blog/moving-to-a-new-web-host/
Step 3: Change DNS to point to your new web host.
This is the actual crux of the matter. First, some DNS background. When Googlebot(s) or anyone else tries to reach or crawl your site, they look up the IP address, so mattcutts.com would map to an IP like 63.111.26.154. Googlebot tries to do reasonable things like re-check the IP address every 500 fetches or so, or re-check if more than N hours have passed. Regular people who use DNS in their browser are affected by a setting called TTL, or Time To Live. TTL is measured in seconds and it says “this IP address that you fetched will be safe for this many seconds; you can cache this IP address and not bother to look it up again for that many seconds.” After all, if you looked up the IP address for each site with every single webpage, image, JavaScript, or style sheet that you loaded, your browser would trundle along like a very slow turtle.
If you read that page you'll see that Matt Cutts tested the move on mattcutts.com himself and did not see any major impact. Keep in mind, however, that mattcutts.com is a high-profile domain, since he is well known for talking about his work at Google.
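To make the TTL mechanics from the quote concrete, here is a minimal sketch that looks up a domain's A records and the TTL a resolver is allowed to cache them for. It assumes the third-party dnspython package (pip install dnspython); any DNS library would serve equally well.

```python
# Minimal sketch of the lookup-and-TTL behaviour described above.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

answer = dns.resolver.resolve("mattcutts.com", "A")

# All A records in one answer share a TTL: the number of seconds a
# resolver may keep serving this cached IP before looking it up again.
print("TTL (seconds):", answer.rrset.ttl)
for record in answer:
    print("A record:", record.address)
```

A low TTL means caches expire quickly and a DNS switch is picked up fast; a high TTL means visitors can keep being sent to the old IP long after the change.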
The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine.
I would concede this point if the major updates require a test environment that differs from the live environment. By environment I mean different server architecture, such as different PHP/ASP versions or database types/versions that the current live server cannot or will not be updated to. When you create a test environment you generally want to duplicate the live environment, so you can simply push the test elements live once complete. If the server architecture is part of the test, then I can't argue with the logic.
-
When you switch DNS you are at the mercy of how fast the DNS change propagates across the internet.
How fast this propagates isn't really an issue for us.
Larger sites that see a lot of traffic daily are likely indexed more frequently and would thus suffer less traffic loss.
I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until the change fully propagated, right?
If not, then why switch DNS if you don't have to?
The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine.
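For what it's worth, the claim that only a DNS switch can prove the test environment works can be checked another way: request the test server's IP directly while presenting the production hostname, so the application behaves as if it were live. A hedged sketch, assuming the requests package; the IP and hostname below are placeholders, not values from this thread:

```python
# Sanity-check the test environment without touching DNS: hit the test
# server's IP directly but send the production Host header.
# Assumes the requests package; IP and hostname are hypothetical.
import requests

TEST_SERVER_IP = "203.0.113.10"      # hypothetical test-server IP
PRODUCTION_HOST = "www.example.com"  # hypothetical production hostname

response = requests.get(
    f"http://{TEST_SERVER_IP}/",
    headers={"Host": PRODUCTION_HOST},
    timeout=10,
)
print(response.status_code, len(response.content), "bytes received")
```

The same trick can be done in a browser with a temporary hosts-file entry pointing the production hostname at the test server's IP.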
-
Switching DNS is not an optimal solution in most cases. When you switch DNS you are at the mercy of how fast the DNS change propagates across the internet. Larger sites that see a lot of traffic daily are likely indexed more frequently and would thus suffer less traffic loss.
If you are making major updates to your domain, switching the DNS at the same time is even more ill-advised. Would you not want to see how these major updates affect your website first? If you switch the DNS and then see a huge traffic loss, you are left trying to figure out whether the updates hurt, whether it is just a DNS propagation issue, or some combination of the two.
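To separate those two explanations, you can ask several public resolvers which IP they are currently serving for the domain. A minimal sketch, again assuming dnspython; the domain is a placeholder, while the resolver addresses are well-known public services:

```python
# Check how far a DNS change has propagated by querying several public
# resolvers directly. Assumes dnspython; the domain is hypothetical.
import dns.resolver

DOMAIN = "www.example.com"  # hypothetical domain
RESOLVERS = {"Google": "8.8.8.8", "Cloudflare": "1.1.1.1", "Quad9": "9.9.9.9"}

for name, ip in RESOLVERS.items():
    resolver = dns.resolver.Resolver()
    resolver.nameservers = [ip]        # query this resolver only
    answer = resolver.resolve(DOMAIN, "A")
    print(name, "->", sorted(record.address for record in answer))

# If some resolvers still return the old IP, a traffic dip is at least
# partly a propagation artifact rather than proof the updates hurt.
```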
My advice: take a two-tier approach if the ultimate goal is to move DNS.
Step 1: Update the site on the same DNS.
Step 2: Update DNS after the updates are proven to be valuable to users. If not, then why switch DNS if you don't have to?
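When Step 2 does happen, a common mitigation (an addition here, not something the answer above mentions) is to lower the record's TTL well before the switch so cached entries expire quickly. A sketch that checks whether the currently served TTL looks safe for cutover, assuming dnspython; the domain and threshold are placeholders:

```python
# Pre-cutover check: is the A record's TTL low enough that caches will
# expire quickly after the switch? Assumes dnspython; values are
# hypothetical.
import dns.resolver

DOMAIN = "www.example.com"  # hypothetical domain
MAX_CUTOVER_TTL = 300       # 5 minutes, an illustrative threshold

ttl = dns.resolver.resolve(DOMAIN, "A").rrset.ttl
if ttl <= MAX_CUTOVER_TTL:
    print(f"TTL is {ttl}s; caches clear quickly, reasonable to switch.")
else:
    print(f"TTL is {ttl}s; lower it and wait at least {ttl}s first.")
```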