Will using https across our entire site hurt our external backlinks?
-
Our site is secured throughout, so it loads sitewide as https. It is canonicalized properly - any attempt to load an existing page over http is forced to https. My concern is with backlinks. We've put a lot of effort into social media, so we're getting some nice blog linkage. The problem is that the links generally point to http rather than https (understandable, since that's the default for most web users). The site still loads with no problem, but since a redirect doesn't transfer all the link juice across, my concern is that we're leaking some perfectly good link credit. From a backlink standpoint, are we harming ourselves by making the whole site secure by default? The site isn't very big at present, but I'm looking at adding hundreds of new pages, so if we're going to make the change, now is the time to do so. Let me know what you think!
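For what it's worth, here's a rough Python sketch of how you might spot-check that http URLs really do come back as a single 301 straight to their https equivalents (rather than a 302 or a redirect chain). It assumes the requests library, and the URLs below are just placeholders for your own pages:

```python
import requests

# Hypothetical sample of pages to check -- substitute your own URLs.
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/blog/some-post/",
]

for url in PAGES:
    # Don't follow redirects; inspect the very first response directly.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith("https://"):
        print(f"OK   {url} -> {location}")
    else:
        print(f"WARN {url} returned {resp.status_code} (Location: {location or 'none'})")
```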
-
We run one site entirely on https and there is no problem at all - we build links as usual and see no negative impact; in fact we are doing very well.
It's not the usual practice, but for SEO, as long as you are playing by the rules it will have no impact whatsoever.
-
Yes -- I actually just finished reverting from HTTPS back to HTTP because of the handshake overhead. Think about this:
- How many images does the page have? All of your images need to be served over SSL.
- How many inline styles and external style sheets? All of your style sheets need to be served over SSL too.
- Do all of the sites you link to have SSL as well? I found that linking to something insecure can sometimes red-flag the page as containing elements that are not secure.
It's a lot of work and a lot of maintenance, and in the end the visitor gets frustrated and leaves. Even if you are at Rackspace with a dedicated SSL proxy server, load balancers, and auto scaling, the client's browser still has to negotiate the SSL handshake for all of the images/scripts on your page.
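If you do stay on https, a quick way to catch those insecure-element problems is to scan each page for hard-coded http:// references. Here is a rough Python sketch, assuming the requests library and a placeholder URL (anchor hrefs aren't strictly mixed content, but outbound links were part of the red-flag issue above, so they get reported too):

```python
import requests
from html.parser import HTMLParser

class InsecureResourceFinder(HTMLParser):
    """Collects src/href attribute values that still point at plain http://."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

# Hypothetical page to audit -- replace with one of your own https URLs.
page = "https://www.example.com/"
finder = InsecureResourceFinder()
finder.feed(requests.get(page, timeout=10).text)

for tag, url in finder.insecure:
    print(f"Insecure reference in <{tag}>: {url}")
```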
-
Your backlinks will suffer. You need to go and 301 each of the http pages to the https ones. That said, 301s do not pass 100% of the link juice on, and many people will continue to link to the http pages.
Do you really need every page to be https? Why not just have the key data-exchange pages as https and the rest as http?
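To illustrate that split approach, here's a rough sketch of selective HTTPS enforcement - purely an assumption of a small Flask-style app with made-up paths, not a description of the asker's setup:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical list of path prefixes that handle sensitive data and must use HTTPS.
SECURE_PREFIXES = ("/checkout", "/account", "/login")

@app.before_request
def enforce_selective_https():
    # Only force https where it's actually needed; everything else stays on http.
    if request.path.startswith(SECURE_PREFIXES) and not request.is_secure:
        # Permanent redirect so crawlers consolidate signals on the https URL.
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.route("/")
def home():
    return "Public page served over plain http."
```

In practice you'd more likely do the same thing in your web server or load balancer configuration rather than in application code, but the idea is the same: a permanent redirect onto https only for the pages that need it.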
-
I would seriously consider the possibility of making only as much of your site https as is really necessary.
That said, the portion of your link juice being lost due to the redirects is probably relatively insignificant. But if you could keep half the site as http, that would cut your leakage in half.
-
There's very rarely any reason to force SSL for an entire site. Any content that you're trying to SEO obviously has no need to be encrypted.
SSL puts a huge overhead on page load time.
-
We have the same issue. Our site is 100% SSL. We use 301 redirects for any http requests to go to https instead. We rank well in the SERPs for phrases we care about. I'm pretty sure the link juice is flowing from http to https because of the 301s (many of our external links are http).
(and, SEOMoz folks: really looking forward to your crawl tool working with https sites!)
-
Don't really see a way around it. Only force HTTPS on pages that need it. If you can operate at 80% HTTP and 20% HTTPS, that is much better, as people rarely link to HTTPS pages.
So yes, change it.
-
Related Questions
-
Can you use no-index to counter duplicate content across separate domains?
Hi Moz Community, I have a client who is splitting a sub-brand out of a company website onto its own domain. They have lots of content around the theme and want to migrate most of it to the new domain, but they also want to keep that content on the main site, since the main site gets lots of traffic. My question: they want search traffic to go to the new site, but also want to keep the best content on the original site so it can be found in the navigation. If they no-index the identical content on the main site and index the content on the new site, will they still be penalised for duplicate content? Our advice has been to keep the thematic content on both sites but make the versions different enough that they are not considered duplicate - we routinely write the same blog post in 50 different ways for them - but their Head of Web asked whether no-index is a route that means they don't need to pay for, and wait for, brand-new content. They are comfortable losing traffic until the new domain gets traction. In theory, if they are telling Google not to index or rank the main-site content, the new site shouldn't be penalised, but I'm not confident giving that advice as I've never been asked to do this before. Thoughts?
Technical SEO | Algorhythm_jT0
-
Backlinks from an Association Site
My company is joining an Industrial Association. Part of the membership is a link to our site from theirs. I've found that visiting their site triggers a "threat alert" through our company malware detection system and shows a link that may be infected with malware. With all of that said, I have two questions: Since this is a paid membership, will Google penalize us for having a link to our company from this association's website? And since a link on their site has potential malware issues, should we add our link to their site, or could it be harmful to us? Any helpful advice is appreciated.
Technical SEO | SteveZero121
-
PR / News stories across multiple sites - is it still duplicate content?
I was wondering whether Google makes an exception for news stories where duplicate content is concerned? After all, depending on the story there can be a lot of quotes and large blocks of the same details. Is Google intelligent enough to distinguish between general website content and actual news stories? Also, like a lot of big firms, we publish news stories on our website, but then they get passed on to other websites in the form of PR and republished there. So if we put a story on our website and within a few hours, or the same day, other websites publish it (literally copied and pasted) - how does this affect our website in terms of duplicate content? Will Google know automatically that we published it first? Thanks!
Technical SEO | Brabian0
-
Which address do I use for citations
Hello, when I created my Google Places listing, I entered my address, and when the listing was activated I noticed that the address Google Places was displaying was a short abbreviation of my address. So my question is: when it comes to creating citations for my listing, do I use the address Google Places generated for me in the listing, or the long version of my address? I've heard that when it comes to creating citations, you need to make sure the address is identical across the board. I hope this makes sense. Thanks!
Technical SEO | fbbcseo0
-
The use of robots.txt
Could someone please confirm that if I do not want to block any pages on my site, then I do not need a robots.txt file at all? Thanks
Technical SEO | ICON_Malta0
-
Way to find how many sites within a given set link to a specific site?
Hi, Does anyone have an idea on how to determine how many sites within a list of 50 sites link to a specific site? Thanks!
Technical SEO | SparkplugDigital0
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning the robots.txt and re-including all the excluded directories in GWMT to see if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
Technical SEO | JacoRoux0