Will using https across our entire site hurt our external backlinks?
-
Our site is secured throughout, so it loads sitewide as https. It is canonicalized properly: any attempt to load an existing page as http is forced to https. My concern is with backlinks. We've put a lot of effort into social media, so we're getting some nice blog linkage. The problem is that the links generally point to http rather than https (understandable, since that's the default for most web users). The site still loads with no problem, but since a redirect doesn't transfer all of the link juice, my worry is that we're leaking some perfectly good link credit.

From the standpoint of backlinks, are we harming ourselves by making the whole site secure by default? The site isn't very big at present, but I'm looking at adding hundreds of new pages, so if we're going to make a change, now is the time. Let me know what you think!
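As a rough illustration of the canonicalization described above (the function name and example URLs are hypothetical, not from the thread), here is a minimal sketch of mapping any http URL to its https equivalent:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Force an http URL to its https equivalent; leave other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        # SplitResult is a namedtuple, so _replace builds a modified copy
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)
```

In practice the server does this with a 301 rewrite rule; a helper like this is mainly useful for normalizing URLs in sitemaps or internal links so crawlers never see the http version in the first place.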
-
We run one site entirely over https and there is no problem at all: we link build as usual and see no negative impact; in fact we are doing very well.
It's not common practice, but as long as you are playing by the rules it will have no SEO impact whatsoever.
-
Yes -- I actually just got done reverting from HTTPS back to HTTP because of the handshake overhead. Think about this:
- How many images does the page have? All of your images need to be served over SSL.
- How many styles and external style sheets? All of your style sheets need to be served over SSL.
- Do all of the sites you link to have SSL as well? I found that linking to insecure content can sometimes red-flag the page as containing elements that are not secure.
It's a lot of work and a lot of maintenance, and in the end the visitor gets frustrated and leaves. Even if you are at Rackspace with a dedicated SSL proxy server, load balancers, and auto-scaling, the client's browser still needs to complete an SSL handshake to fetch the images and scripts on your page.
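The mixed-content audit described above can be done mechanically. A minimal sketch (stdlib only; the class name and sample markup are hypothetical) that scans a page's HTML for resources still referenced over plain http:

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect src/href attribute values that load over plain http."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; value may be None
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

scanner = MixedContentScanner()
scanner.feed('<img src="http://cdn.example.com/a.png">'
             '<script src="https://cdn.example.com/a.js"></script>')
```

Note that browsers only warn about http-loaded *resources* (images, scripts, stylesheets) on an https page; plain anchor links to http sites do not trigger the mixed-content warning, so in a real audit you would likely filter `href` hits to `<link>` tags.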
-
Your backlinks will suffer. You need to 301 each of the http pages to the https ones. That being said, 301s do not pass 100% of the link juice, and many people will continue to link to the http pages.
Do you really need every page to be https? Why not serve just the key data-exchange pages as https and the rest as http?
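The 301 suggested above is normally a web-server rewrite rule rather than application code, but as an illustration of what the redirect actually sends (the host name is a placeholder, not from the thread), a minimal sketch in Python:

```python
from http.server import BaseHTTPRequestHandler

HTTPS_HOST = "www.example.com"  # assumption: your canonical https host

def https_location(host: str, path: str) -> str:
    """Build the Location header value for a 301 from http to https."""
    return f"https://{host}{path}"

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every plain-http GET with a permanent (301) redirect."""

    def do_GET(self):
        self.send_response(301)  # 301, not 302, so link equity is passed
        self.send_header("Location", https_location(HTTPS_HOST, self.path))
        self.end_headers()
```

The key detail is the 301 status code: a permanent redirect is the signal search engines treat as "transfer credit to the target", whereas a temporary 302 generally is not.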
-
I would seriously consider the possibility of making only as much of your site https as is really necessary.
That said, the portion of your link juice being lost due to the redirects is probably relatively insignificant. But if you could keep half the site as http, that would cut your leakage in half.
-
There's very rarely any reason to force SSL for an entire site. Any content that you're trying to SEO obviously has no need to be encrypted.
SSL puts a huge overhead on page load time.
-
We have the same issue. Our site is 100% SSL. We use 301 redirects for any http requests to go to https instead. We rank well in the SERPs for phrases we care about. I'm pretty sure the link juice is flowing from http to https because of the 301s (many of our external links are http).
(and, SEOMoz folks: really looking forward to your crawl tool working with https sites!)
-
Don't really see a way around it. Only force HTTPS on pages that need it. If you can operate at 80% HTTP and 20% HTTPS, that is much better, as people rarely link to HTTPS pages.
So yes, change it.