Periodic DNS Switching for Major Website Updates - Any Downsides?
-
A company is performing major updates to a website, and the proposed go-live process was explained as follows:
Once the updates are done in the testing environment and the site is ready to go live, we switch the DNS to point at the testing environment, and that testing environment becomes the production site. The old production site becomes the new testing environment.
Are there any potential negatives to this?
Is there a name for this technique?
Of course, we've already considered:
- additional hosting cost
- potential performance differences
- reinstalling and setting up server settings (SSL, etc.)
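For what it's worth, the swap being proposed, two identical environments that trade the production role at each cutover, is usually called a blue-green deployment. A minimal sketch of the role swap, using a hypothetical in-memory zone record and made-up IPs rather than a real DNS provider API:

```python
# Minimal sketch of the proposed cutover: two environments trade roles by
# repointing the A record. The zone dict and IPs are hypothetical; a real
# cutover would go through your DNS provider's API.

def swap_environments(zone, roles):
    """Point production DNS at the tested environment and demote the old one."""
    old_prod, old_test = roles["production"], roles["testing"]
    zone["www"] = old_test["ip"]  # production traffic now resolves to the tested stack
    roles["production"], roles["testing"] = old_test, old_prod
    return zone, roles

zone = {"www": "203.0.113.10"}
roles = {
    "production": {"name": "blue", "ip": "203.0.113.10"},
    "testing": {"name": "green", "ip": "203.0.113.20"},
}
zone, roles = swap_environments(zone, roles)
print(zone["www"], roles["production"]["name"])  # → 203.0.113.20 green
```

The swap itself is instantaneous at the DNS provider; the downsides discussed below come from how long cached copies of the old record linger.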
-
I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until the change fully propagated, right?
The problem with changing DNS is an initial traffic drop while resolvers and caches along the way pick up the update.
Quote REF: http://www.mattcutts.com/blog/moving-to-a-new-web-host/
Step 3: Change DNS to point to your new web host.
This is the actual crux of the matter. First, some DNS background. When Googlebot(s) or anyone else tries to reach or crawl your site, they look up the IP address, so mattcutts.com would map to an IP like 63.111.26.154. Googlebot tries to do reasonable things like re-check the IP address every 500 fetches or so, or re-check if more than N hours have passed. Regular people who use DNS in their browser are affected by a setting called TTL, or Time To Live. TTL is measured in seconds and it says “this IP address that you fetched will be safe for this many seconds; you can cache this IP address and not bother to look it up again for that many seconds.” After all, if you looked up the IP address for each site with every single webpage, image, JavaScript, or style sheet that you loaded, your browser would trundle along like a very slow turtle.
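The caching behavior Matt describes can be sketched as a tiny resolver cache. The lookup function, hostnames, and timings below are stand-ins for illustration, not a real resolver:

```python
import time

# Toy illustration of the TTL behavior described above: a cached answer is
# reused until its TTL expires, so some clients keep hitting the old IP for
# up to TTL seconds after the DNS record changes.

class DnsCache:
    def __init__(self, lookup):
        self._lookup = lookup     # stand-in for a real upstream DNS query
        self._cache = {}          # hostname -> (ip, expires_at)

    def resolve(self, host, now=None):
        now = time.time() if now is None else now
        hit = self._cache.get(host)
        if hit and now < hit[1]:  # still within TTL: serve the cached IP
            return hit[0]
        ip, ttl = self._lookup(host)  # cache miss or expired: ask upstream
        self._cache[host] = (ip, now + ttl)
        return ip

# Simulate a cutover: the authoritative answer changes, TTL is 300 seconds.
answers = {"example.com": ("203.0.113.10", 300)}
cache = DnsCache(lambda host: answers[host])

print(cache.resolve("example.com", now=0))      # 203.0.113.10, cached for 300s
answers["example.com"] = ("203.0.113.20", 300)  # DNS switched to the new site
print(cache.resolve("example.com", now=200))    # 203.0.113.10 -- TTL not expired yet
print(cache.resolve("example.com", now=400))    # 203.0.113.20 -- cache expired, new IP
```

This is why some visitors land on the old site and some on the new one during the window after a switch: each resolver's cache expires on its own clock.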
If you read that page you'll see Matt Cutts tested mattcutts.com himself and did not see any major impact. However, Matt Cutts has a high-profile domain, since he is well known for talking about his experience at Google.
The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine.
I would concede this point if the major updates depend on a test environment that differs from the live environment. By environment I mean different server architecture, such as different PHP/ASP versions or database types/versions that the current live server cannot or will not be updated to. When you create a test environment, you generally want to duplicate the live environment so you can simply push the test elements live once complete. If the server architecture is part of the test, then I can't argue with the logic.
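The parity argument above can be made mechanical: before trusting a test-to-live file migration, diff the two stacks. A toy sketch with made-up version fields (in practice you would collect these from each server):

```python
# Toy parity check for the point above: a test environment is only a safe
# stand-in for live if the stacks match. The version fields are invented;
# a real check would gather them from each server.

def environment_diff(live, test):
    """Return the settings that differ between the live and test stacks."""
    keys = set(live) | set(test)
    return {k: (live.get(k), test.get(k)) for k in sorted(keys)
            if live.get(k) != test.get(k)}

live = {"php": "7.4", "mysql": "8.0", "os": "ubuntu-20.04"}
test = {"php": "8.1", "mysql": "8.0", "os": "ubuntu-20.04"}

drift = environment_diff(live, test)
print(drift)  # {'php': ('7.4', '8.1')} -- migrating files back to live is risky
```

An empty diff supports the "just push the files live" approach; any drift is an argument for keeping the tested stack and switching DNS instead.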
-
When you switch DNS you are at the mercy of how fast the DNS change propagates across the internet.
How fast this propagates isn't really an issue for us.
Larger sites that see a lot of traffic daily are likely indexed more frequently and would thus suffer less traffic loss.
I don't understand how we'd lose traffic... some visitors would see the old site and some would see the new site until the change fully propagated, right?
If not, then why switch DNS if you don't have to?
The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine.
-
Switching DNS is not an optimal solution in most cases. When you switch DNS you are at the mercy of how fast the DNS change propagates across the internet. Larger sites that see a lot of traffic daily are likely indexed more frequently and would thus suffer less traffic loss.
If your domain is undergoing major updates, switching the DNS at the same time is even more ill-advised. Would you not want to see how these major updates affect your website first? If you switch the DNS and then see a huge traffic loss, you are left trying to figure out "did our updates hurt?", "is this just a DNS propagation issue?", or some combination of the two.
My advice: take a two-tier approach if the ultimate goal is to move DNS.
Step 1: Update Site On Same DNS
Step 2: Update DNS after the updates are proven to be valuable to users. If not, then why switch DNS if you don't have to?
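If you do end up switching DNS in step 2, the propagation window can be shrunk by lowering the record's TTL ahead of the cutover: drop the TTL at least one old-TTL before the switch so every cache has picked up the short value by then. A back-of-the-envelope schedule (the timings are illustrative, not provider guidance):

```python
# Back-of-the-envelope cutover schedule: lower the TTL at least one old-TTL
# before the switch, so that by cutover time every cache holds the short TTL
# and stale answers expire quickly afterward. Times are in epoch seconds.

def cutover_schedule(cutover_at, old_ttl, new_ttl=300):
    """Return when to lower the TTL and when the last stale answer expires."""
    return {
        "lower_ttl_at": cutover_at - old_ttl,  # caches need old_ttl seconds to notice
        "switch_at": cutover_at,               # repoint the record here
        "all_clear_at": cutover_at + new_ttl,  # after this, every cache has the new IP
    }

# Example: record currently carries a 24h TTL; cutover planned at t=100000.
plan = cutover_schedule(cutover_at=100_000, old_ttl=86_400, new_ttl=300)
print(plan)  # {'lower_ttl_at': 13600, 'switch_at': 100000, 'all_clear_at': 100300}
```

With a lowered TTL, the "who sees which site" window shrinks from hours to minutes, which also makes any post-switch traffic change much easier to attribute.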