Will using https across our entire site hurt our external backlinks?
-
Our site is secured throughout, so it loads sitewide as https. It's canonicalized properly: any attempt to load an existing page over http is redirected to https. My concern is with backlinks. We've put a lot of effort into social media, so we're getting some nice blog linkage. The problem is that the links generally point to http rather than https (understandable, since that's the default for most web users). The site still loads with no problem, but since a redirect doesn't transfer all of the link juice across, my worry is that we're leaking some perfectly good link credit. From a backlink standpoint, are we harming ourselves by making the whole site secure by default? The site isn't very big at present, but I'm looking at adding hundreds of new pages, so if we're going to make a change, now is the time to do it. Let me know what you think!
-
We run one site entirely on https and there is no problem at all. We build links as usual and see no negative impact; in fact, we are doing very well.
It's not the usual practice, but for SEO, as long as you are playing by the rules, it will have no impact whatsoever.
-
Yes -- I actually just finished reverting from HTTPS back to HTTP because of the handshake overhead. Think about this:
- How many images does the page have? All of your images need to load over SSL.
- How many styles and external style sheets? All of your style sheets need to load over SSL.
- Do all of the sites you link to have SSL as well? I found that linking out can sometimes red-flag the page as containing elements that are not secure.
It's a lot of work and a lot of maintenance, and in the end the visitor gets frustrated and leaves. Even if you're at Rackspace with a dedicated SSL proxy server, load balancers, and auto-scaling, the client's browser still needs to negotiate the SSL connection for all of the images and scripts on your page.
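The mixed-content worry in the list above can be checked automatically by scanning a page for sub-resources still loaded over plain http. A minimal sketch using only Python's standard library; the page fragment and CDN URLs are hypothetical examples:

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect http:// asset URLs that would trigger a mixed-content
    warning if the page itself were served over https."""

    # (tag, attribute) pairs that pull in sub-resources
    ASSET_ATTRS = {("img", "src"), ("script", "src"), ("link", "href")}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in self.ASSET_ATTRS and value and value.startswith("http://"):
                self.insecure.append(value)

# Hypothetical page fragment for illustration
page = """
<img src="http://cdn.example.com/logo.png">
<script src="https://cdn.example.com/app.js"></script>
<link rel="stylesheet" href="http://cdn.example.com/site.css">
"""

scanner = MixedContentScanner()
scanner.feed(page)
print(scanner.insecure)
# → ['http://cdn.example.com/logo.png', 'http://cdn.example.com/site.css']
```

Anything the scan turns up would need to be rewritten to https (or a protocol-relative reference) before the page can load cleanly over SSL.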
-
Your backlinks will suffer. You need to 301 each of the http pages to its https equivalent. That said, 301s do not pass 100% of the link juice, and many people will continue to link to the http pages.
Do you really need every page to be https? Why not make just the key data-exchange pages https and leave the rest as http?
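For reference, the sitewide http-to-https 301 discussed in this thread can be done with a single rewrite rule. This is a minimal sketch assuming an Apache server with mod_rewrite enabled (nginx and other servers have equivalents); adjust for your own setup:

```apache
# Send every plain-http request to its https equivalent with a 301,
# preserving the host, path, and query string.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

A blanket rule like this is what makes the "leaking link juice through redirects" question above relevant: every http backlink gets answered with a 301 rather than a 200.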
-
I would seriously consider the possibility of making only as much of your site https as is really necessary.
That said, the portion of your link juice lost to the redirects is probably relatively insignificant. But if you could keep half the site on http, that would cut your leakage in half.
-
There's very rarely any reason to force SSL for an entire site. Any content that you're trying to rank obviously has no need to be encrypted.
SSL puts a huge overhead on page load time.
-
We have the same issue. Our site is 100% SSL. We use 301 redirects for any http requests to go to https instead. We rank well in the SERPs for phrases we care about. I'm pretty sure the link juice is flowing from http to https because of the 301s (many of our external links are http).
(and, SEOMoz folks: really looking forward to your crawl tool working with https sites!)
-
I don't really see a way around it: only force HTTPS on the pages that need it. If you can operate at 80% HTTP and 20% HTTPS, that is much better, as people rarely link to HTTPS pages.
So yes, change it.
Related Questions
-
Is having one blanket h1 tag across my site bad?
I've been working on figuring out whether my site host (Jigsy) is hindering my SEO. The site editor doesn't let me assign individual h1 tags for each page; it just applies one across the whole site as part of a theme. Is this normal? Would it be better to have individual h1s? Is that a thing? Or should I work on making a better one, since all my pages should technically be related to counseling/therapy? Site: www.joeborders.com
Technical SEO | joebordersmft
-
Will my site get devalued if I add the same company schema to all the pages of my website?
If I add the exact same schema markup to every page on my website, is it considered duplicate content? Our CMS is telling me that if I want schema markup on our site, it has to be the same on every page of the website. This limitation is frustrating, but I am trying to figure out the best way to work within their boundaries. Your help is appreciated.
Technical SEO | Annette_Wetzel
-
Can I Block https URLs using Host directive in robots.txt?
Hello Moz Community, I recently found that Google's bots have started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our site. Instead of creating a separate robots.txt file for the https version of my website, can I use the Host directive in robots.txt to suggest to Google's bots which version of the website is the original? Host: http://www.example.com I was wondering whether this method will work and suggest to Google's bots that the HTTPS URLs are a mirror of this website. Thanks for all of the great responses! Regards,
Ramendra
Technical SEO | TJC.co.uk
-
Cache Not Working on Our Site
We redesigned our site (www.motivators.com) back in April. Ever since then, we can't view the cache. It loads as a blank, white page but the cache text is at the top saying: "This is Google's cache of http://www.motivators.com/. It is a snapshot of the page as it appeared on Jul 22, 2013 15:50:40 GMT. The current page could have changed in the meantime. Learn more. Tip: To quickly find your search term on this page, press Ctrl+F or ⌘-F (Mac) and use the find bar." Has anyone else ever seen this happen? Any ideas as to why it's happening? Could it be hurting us? Advice, tips, suggestions would be very much appreciated!
Technical SEO | Motivators
-
Am I using 301 correctly?
Hello, I have a 'free download' type site offering free graphics for designers. To prevent hotlinking, we authenticate the downloads and use a 301 redirect. So, for example, the download URL looks like this if someone clicks the download button: http://www.website.com/resources/243-name-of-the-file/download/dc37 and then we 301 that URL back to: http://www.website.com/category-name/243-name-of-the-file Is a 301 the correct way to do that?
Technical SEO | shawn81
-
Dear Support, my client is a large bank and their site is https, and SEOmoz does not give any data. What can I do? Thank you.
Dear Support, what should I do in the case of https pages? SEOmoz does not give any data about them. Please help ASAP. Thank you.
Technical SEO | SebestynMrton
-
301 an old site to a newer site...
Hi, First, to be upfront: these are not my websites; I'm asking because they are trying to compete in my niche. Here are the details, then the questions. There is a website that is a few months old with about 200 indexed pages and about 20 links; call this newsite.com. There is a website that is a few years old with over 10,000 indexed pages and over 20,000 links; call this oldsite.com. newsite.com acquired oldsite.com and set up a 301 redirect so every page of oldsite.com redirects to the front page of newsite.com. newsite.com and oldsite.com are on the same topic; the 301 occurred in the past week. Now oldsite.com is out of the SERPs and newsite.com is pretty much ranking in the same spot (top 10) for the main term. Here are my questions: 1. The 10,000 pages on oldsite.com had plenty of internal links that no longer exist, so I imagine when the dust settles it will be as if oldsite.com were a one-page site that redirects to newsite.com. How long will the ranking boost last? 2. With the redirect set up to completely ignore the structure and content of oldsite.com, it's clear to me that it was set up to pass the link juice from oldsite.com to newsite.com. Do the major SEs see this as a form of spam (manipulating the rankings), or do they see it as a good way to combine two or more websites? 3. Does this work? Is everybody doing it? Should I be doing it? Or are there better ways for me to combat this type of competition (e.g. we could make a lot of great content for the money spent buying oldsite.com, but we certainly wouldn't get such an immediate increase in traffic)?
Technical SEO | RR500
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains; in December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt and also manually removed all the directories from GWMT. Now, three weeks later, the number of external links just keeps on growing. Here are some stats: 11-Apr-11: 543,747,534; 12-Apr-11: 554,066,716; 13-Apr-11: 554,066,716; 14-Apr-11: 554,066,716; 15-Apr-11: 521,528,014; 16-Apr-11: 515,098,895; 17-Apr-11: 515,098,895; 18-Apr-11: 515,098,895; 19-Apr-11: 520,404,181; 20-Apr-11: 520,404,181; 21-Apr-11: 520,404,181; 26-Apr-11: 520,404,181; 27-Apr-11: 520,404,181; 28-Apr-11: 603,404,378. I am now thinking of cleaning the robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
Technical SEO | JacoRoux