HTTP → HTTPS Migration - Both Websites Live Simultaneously
-
We have a situation where a vendor, who manages many of our websites, is migrating their platform to HTTPS. The problem is that the HTTP and new HTTPS versions will be live simultaneously (in order to give clients time to audit both sites before the hard switch). I know this isn't how it should be done, but this is the problem we are facing.
My concern was that we would have two websites in the index, so I suggested that they noindex the new HTTPS website until we are ready for the switch. They told me that they would instead add canonicals on the HTTPS pages that point to the HTTP versions, and when it's time for the switch, reverse the canonicals.
Is this a viable approach?
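For context, the vendor's proposal amounts to serving a tag like this in the head of each HTTPS page during the transition, then flipping its direction at switch time (URLs here are placeholders, not our actual domains):

```html
<!-- Served on https://www.example.com/page DURING the transition: -->
<link rel="canonical" href="http://www.example.com/page">

<!-- AFTER the hard switch, served on the HTTP page instead: -->
<link rel="canonical" href="https://www.example.com/page">
```

Each URL canonicalizes to its own counterpart on the other protocol, so only one version of each page should be eligible to index at any time.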
-
We will definitely employ the 301 redirects once we have completely audited the HTTPS site for any absolute links. I just wanted to ensure that we wouldn't have two sites in the index during the transition. Thanks for the feedback.
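As a rough illustration of that audit step, here is a minimal sketch (the host name and sample markup are placeholders, and a real audit would crawl every page and also check stylesheets, scripts, and canonical tags) that flags absolute `http://` links left in a page's HTML:

```python
import re

def find_absolute_http_links(html, old_host="www.example.com"):
    """Return absolute http:// URLs for old_host found in href/src attributes."""
    pattern = re.compile(
        r'(?:href|src)\s*=\s*["\'](http://' + re.escape(old_host) + r'[^"\']*)["\']',
        re.IGNORECASE,
    )
    return pattern.findall(html)

# Sample page fragment: one relative link and one https link should be ignored.
page = """
<a href="http://www.example.com/about">About</a>
<a href="/contact">Contact</a>
<img src="http://www.example.com/logo.png">
<a href="https://www.example.com/blog">Blog</a>
"""
print(find_absolute_http_links(page))
# → ['http://www.example.com/about', 'http://www.example.com/logo.png']
```

Relative links are fine after the switch, so only hard-coded `http://` references need fixing before the redirects go live.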
-
Hey there! Great question and sounds like you are thinking through the important things.
I think the canonical approach is best, especially if URLs and content are staying the same. I would not mess around with robots.txt or noindex at this point, as the canonical should keep the HTTPS version out of the index and allow the HTTP version to keep ranking.
Long term, of course, a proper 301 redirect strategy is the right solution. But short term, to test user behavior, conversions, and rankings, a canonical is a great way to go.
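When you do make the hard switch, the typical server-side rule looks something like this (a sketch assuming Apache with mod_rewrite; the vendor's platform may need an equivalent in nginx or their own config):

```apache
# Permanently redirect every HTTP request to its HTTPS counterpart.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

A blanket protocol redirect like this preserves the full path and query string, so each HTTP URL passes its signals to the matching HTTPS URL.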
-
Thanks.
-
Their plan to use canonicals is a great approach.