Still Not Secure in Chrome
-
Hi
We migrated to HTTPS in November, but we still aren't showing as Secure.
I thought it was due to an insecure SHA-1 signature in the SSL certificate, so I am waiting to get this fixed.
We had a few outstanding HTTP links; those have been updated, but we're still getting the issue.
Does anyone have an idea of what it could be? https://www.key.co.uk/en/key/
-
I'm surprised to say... that SSL certificate you have is very poor quality and has a number of pretty significant security issues, in addition to the SHA-1 signature problem.
To answer your specific question, there's nothing you or your devs can do about the SHA-1 signature problem, as it exists on one of the certificates in the chain that is owned and controlled by Thawte (the cert issuer or "Certificate Authority"), not on your own certificate. It is up to them to fix it.
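If you want to double-check which signature algorithm your own certificate is actually using, here's a minimal sketch (my own illustration, not something from the scan) using Python's built-in ssl module plus the third-party cryptography package. Note it only inspects the leaf certificate, so a SHA-1 intermediate in Thawte's chain, as described above, won't show up here; the SSL Labs scan below covers the full chain.

```python
# A minimal sketch; assumes the third-party "cryptography" package is
# installed. This inspects only the server's own (leaf) certificate, not
# the intermediates where the SHA-1 problem described above actually lives.
import ssl
from cryptography import x509

def leaf_signature_hash(host: str, port: int = 443) -> str:
    """Fetch the server's leaf certificate and report its signature hash."""
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode("ascii"))
    return cert.signature_hash_algorithm.name  # e.g. "sha256" or "sha1"

print(leaf_signature_hash("www.key.co.uk"))
```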
As you can see from the cert security scan, there are a number of other issues with the certificate that are unacceptable, especially in a paid certificate. [Edited for clarity: some of those warnings are likely server-specific, meaning the server is being allowed to communicate with the certificate in less-than-optimal ways.]
https://www.ssllabs.com/ssltest/analyze.html?d=www.key.co.uk
It's unlikely that the SHA-1 problem is what's giving the "not secure" warning on the site at the moment (although it will become a major issue later in February), so you'll need to keep looking for resources called over HTTP if you're still getting warnings.
When I had a quick look at the home page, I didn't see any more warnings, as it appears you've fixed the image call that Andrew mentioned. You can use Chrome or Firefox Dev Tools to inspect any pages that are not secure and see exactly which element is causing the failure. It often comes down to hardcoded images (including CSS background images) or hardcoded scripts. For example, your Quotations page is calling a script from Microsoft to validate the form, but it's failing as it's called over HTTP.
Knowing this, you'd want to check any other pages using such form validation. A thorough Screaming Frog crawl to look for any other wayward HTTP calls can also help dig out the remaining random culprits.
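If you want a quick scripted check alongside Dev Tools, here's a rough sketch (assuming the third-party requests and beautifulsoup4 packages) that flags hardcoded HTTP resources in a page's HTML. It won't catch CSS background images or script-injected calls — those still need Dev Tools or a full crawl — but it's handy for spot-checking templates:

```python
# A rough illustration, not a substitute for a full Screaming Frog crawl.
# Assumes the third-party "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

def find_insecure_resources(page_url: str) -> list[str]:
    """Return src/href values on the page that are called over plain HTTP."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    insecure = []
    # Only checks HTML attributes; CSS url() calls and script-injected
    # resources won't appear here.
    for tag in soup.find_all(["img", "script", "link", "iframe", "source"]):
        url = tag.get("src") or tag.get("href")
        if url and url.startswith("http://"):
            insecure.append(url)
    return insecure

for url in find_insecure_resources("https://www.key.co.uk/en/key/"):
    print(url)
```

Running it against the Quotations page and any other pages using that form validation should surface the remaining hardcoded calls.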
Hope that helps!
Paul
Sidenote: your certificate authority is Thawte, which is connected with Symantec, which has done such a bad job of securing its certificates that Chrome and other browsers no longer trust them; in the near future, Symantec-issued certs are going to be officially distrusted and ignored. Symantec has in fact given up its Certificate Authority status and is transferring the business to a new company that does have a trusted infrastructure for issuing certificates. So you're going to need to deal with a new certificate in the not-too-distant future anyway.
Given the poor security of your existing cert, and the upcoming issues, if it were me, I'd be asking for a refund of my current cert, and replacing it with one from a more reliable issuer. I know that can mean a lot of extra work, but as these existing problematic certs go through the distrust process over the next 8 months, sites that haven't dealt with the issue are going to break.
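For what it's worth, if you do replace the cert, the "rekeying" step just means generating a fresh private key and a new certificate signing request hashed with SHA-256 rather than SHA-1 (the CA then signs the certificate it issues you). A rough sketch with Python's third-party cryptography package — the hostname and filenames here are placeholders, not your actual details:

```python
# A hedged sketch of rekeying: generate a fresh private key and a CSR
# signed with SHA-256. Hostname and filenames are placeholders.
# Assumes the third-party "cryptography" package.
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, "www.example.co.uk"),
    ]))
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName("www.example.co.uk")]),
        critical=False,
    )
    .sign(key, hashes.SHA256())  # SHA-256, not SHA-1, signs the request
)

with open("new.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("new.csr", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.PEM))
```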
It's possible that Thawte will build out a reliable process for migrating. At the very least, you need to have a strong conversation with your issuer about how to ensure you are getting the security and long-term reliability you've paid for. Sorry to be the bearer of bad news about what is a much bigger issue. You can read up on it more here:
https://security.googleblog.com/2017/09/chromes-plan-to-distrust-symantec.html
-
Thank you.
Also, does anyone know whether we need to rekey the certificate to fix the SHA-1 signature algorithm, and if so, what we should rekey it with? Or is this something my dev team should know?
-
I also got this report from https://www.whynopadlock.com
Soft Failure: An image with an insecure URL of "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg" was loaded on line 1 of https://www.key.co.uk/en/key.
Errors that are reported on line 1 are generally not part of the source code. This error may be caused by an external javascript file which is writing to the page, however we are unable to reliably detect these scripts in our automated test.
Please contact us using the "Need Help?" link below if you need assistance with resolving this error.
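One quick way to deal with that specific finding is to confirm the flagged image is also reachable over HTTPS, so whatever script is writing it can simply be switched to the secure URL. A small sketch using the third-party requests package:

```python
# A quick sanity check that the image whynopadlock flagged is also
# reachable over HTTPS, so the reference can be swapped to the secure URL.
# Assumes the third-party "requests" package.
import requests

http_url = "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg"
https_url = http_url.replace("http://", "https://", 1)

# Some servers reject HEAD; fall back to GET if you see a 405 here.
resp = requests.head(https_url, allow_redirects=True, timeout=10)
print(https_url, "->", resp.status_code)  # 200 means a drop-in swap works
```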