Pros & cons: HTTP vs HTTPS
-
Hi there,
We are planning to take the step and go from HTTP to HTTPS. The main reason is to appear more trustworthy to our clients, and of course the rumours that it will be better for ranking (in the future).
We have a large e-commerce site. Part of the site is already HTTPS.
I've read a lot of info about the pros and cons, including this Moz article: http://moz.com/blog/seo-tips-https-ssl
But I want to hear from others who have already done this. What did you encounter when changing to HTTPS? Did you have ranking drops, loss of links, etc.? I want to make a list of pros and cons and things we have to do in advance.
Thanks, Leonie
-
We don't use Comscore. Analytics transparently kept tracking everything without any change. We don't use Tag Manager URL-matching tracking, but unless you have defined rules which include the URL protocol, it shouldn't need any attention either.
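Along the same lines, before the switch it can help to find any place where the protocol is hardcoded into your own URLs (templates, tracking rules, config). Here's a rough Python sketch of that kind of check; the domain and directory are just placeholders, not your actual setup:

```python
import re
from pathlib import Path

# Hypothetical helper: find hardcoded "http://" references to your own
# domain in a directory of templates/config files. The same idea applies to
# analytics or Tag Manager rules that match on the protocol.
# "example.com" is a placeholder for your real domain.
HARDCODED = re.compile(r"http://(www\.)?example\.com[^\s\"']*")

def find_hardcoded_http(root):
    """Return (file, url) pairs for every hardcoded http:// self-reference."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            text = path.read_text(errors="ignore")
            for match in HARDCODED.finditer(text):
                hits.append((str(path), match.group(0)))
    return hits
```

Anything this turns up would either need changing to `https://` or, more robustly, to protocol-relative or relative URLs so the page works on both.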
-
Hi, did you encounter problems with other tools, like Google Analytics, Tag Manager, or Comscore?
Thanks, Leonie
-
We have expensive certificates now for the paid section; I think we'll use the same.
I'll ask whether the server supports SNI, not sure about that. Thanks
-
If you choose the most expensive EV certificates, as we did, then for whatever is not directly visible, like the CDN serving JS, CSS and images, you can just use cheap $8/€8 certificates.
One thing I forgot: if your server supports SNI, don't use it.
We did initially, but soon found out some price engines could not read our feeds, the Moz crawler could not crawl, and everyone on XP + IE was left out. So we disabled it.
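For what it's worth, here's a rough Python sketch of how you could check whether a server depends on SNI: it compares the certificate the server presents with the SNI extension against the one it presents without it (simulating an old client like IE on XP). The hostname is a placeholder, and this is an illustration rather than production code:

```python
import socket
import ssl

def presented_cert(host, port=443, sni=True):
    """Return the raw DER certificate the server presents. With sni=False we
    behave like an old client that sends no SNI hostname in the handshake,
    so the server falls back to its default certificate."""
    # Verification is disabled on purpose: we only want to compare the raw
    # bytes of whatever certificate comes back, valid or not.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port), timeout=10) as sock:
        target = host if sni else None
        with ctx.wrap_socket(sock, server_hostname=target) as tls:
            return tls.getpeercert(binary_form=True)

def relies_on_sni(host):
    """True if the server serves a different certificate to non-SNI clients."""
    return presented_cert(host, sni=True) != presented_cert(host, sni=False)
```

If `relies_on_sni("www.example.com")` came back `True`, clients without SNI support (and some crawlers and feed readers) would get the wrong certificate.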
-
Hi Max, thanks, and good to read that you didn't lose rankings. That's my concern, and also the backlinks. Although, as you say, with a redirect all the external links I can't control will redirect to HTTPS.
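Just to make that redirect concrete, here is a minimal sketch (assuming a blanket protocol-only 301, with the host and path preserved; the URL is a placeholder) of the mapping every old HTTP link should follow:

```python
from urllib.parse import urlsplit, urlunsplit

def https_location(url):
    """Return the https:// URL a blanket 301 redirect should point an
    http:// URL to: same host, path and query string, only the scheme
    changes. A same-URL redirect like this keeps external links working."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        raise ValueError("expected an http:// URL")
    return urlunsplit(("https",) + tuple(parts)[1:])
```

The key point is that each HTTP URL should 301 to its exact HTTPS equivalent in a single hop, not to the homepage, so the external links keep passing to the right pages.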
We have two different SSL certificates now; we are looking into what we need and whether we have the right ones.
When I've finished the plan and the list, I think I'll publish it here.
Regards, Leonie
-
I did it a month and a half ago for a couple of websites.
The transition was smooth. I had to buy more SSL certificates than I expected for the many domains serving JS, CSS and so on, but it was not a big hassle.
Just after moving from HTTP to HTTPS I didn't notice any ranking change, and for a good level of accuracy I monitor the same keywords with Moz Rank Tracker, ProRankTracker and SEMrush.
In fact Google recognizes the move slowly, a few URLs at a time; each day you will notice some Google SERPs start serving the HTTPS URLs in place of the HTTP ones.
After a month we had a big jump in rankings: around 30% more keywords in the top 100 and a general increase in ranking for all the keywords already in the top 10, top 30 and top 50.
But I have no idea if it's connected to the shift to HTTPS, since we also constantly do many other things: get backlinks, improve on-page, etc.
At least it didn't seem to penalize the websites.
-
Hi Pixelbypixel, thanks for your reply.
Right now I'm making a plan for the switch. I'm not in a rush, so I really want to make it all clear before we go, or maybe decide not to.
I don't think most of our clients know what's secure and what isn't. But we want the opportunity to communicate about this with our clients, something we don't have right now (only when they order something).
From what I've read, the ranking factor is not a big thing at this moment, but indeed, in the future it could become a bigger one, so that's also a good reason to go.
Thanks for the linked articles!
Regards, Leonie
-
I'm going to give my opinion more than a list of pros and cons. Most people who switch over tend to see a drop in traffic, and if you don't get everything right it can be a nightmare, so make sure you've got your plan ready.
Are you sure most "clients" know what HTTPS is? Most people outside our world have no idea what it is. Combine that with the fact that the so-called ranking boost has yet to be well documented, and it's fairly safe to say it's tiny.
Now, it is possible that your clients know what it is, will see it, and go to your site, but in most cases I suspect that, like the ranking boost, other factors would play a bigger role. My advice is to really make sure you have all the bases covered for your transfer. I also wanted to point out that in the future it may become a bigger factor.
As for advice from people who have already done it, there's oodles of info here on Moz. Here are a few:
http://moz.com/community/q/http-to-https-transition-large-drop-in-search-traffic
http://moz.com/community/q/https-sitewide-move-has-resulted-in-huge-rankings-drop
http://moz.com/community/q/authority-site-drastic-ranking-drop-after-google-https-switch-please-help
Obviously people tend to come here with problems more than a shout-out for how great it went, so don't take those as a massive negative. All of the above is my opinion, and I'm sure others will give different ones. I don't want to put you off, just be aware that there is a lot to cover in a switchover.