Still Not Secure in Chrome
-
Hi
We migrated to HTTPS in November, but we still aren't showing as Secure.
I thought it was due to an insecure SHA-1 signature in the SSL certificate, so I am waiting to get this fixed.
We had a few outstanding HTTP links; those have now been updated, but we're still getting the issue.
Does anyone have an idea of what it could be? https://www.key.co.uk/en/key/
-
I'm surprised to say... that SSL certificate you have is very poor quality and has a number of pretty significant security issues, in addition to the SHA-1 signature problem.
To answer your specific question, there's nothing you or your devs can do about the SHA-1 signature problem, as it exists on one of the certificates in the chain that is owned and controlled by Thawte (the cert issuer or "Certificate Authority"), not on your own certificate. It is up to them to fix it.
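If you want to check the signature algorithm yourself, here's a minimal Python sketch (assuming the third-party cryptography package is installed). Note that it only sees the leaf certificate the server presents, not the intermediate Thawte certificate that actually carries the SHA-1 signature, so a full-chain scanner like SSL Labs is still the authoritative check:

import socket
import ssl

from cryptography import x509

host = "www.key.co.uk"

# Open a TLS connection and grab the server's leaf certificate in DER form
ctx = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        der_cert = tls.getpeercert(binary_form=True)

# Parse the certificate and report which hash algorithm signed it
cert = x509.load_der_x509_certificate(der_cert)
print("Leaf certificate signed with:", cert.signature_hash_algorithm.name)
# "sha256" is fine; "sha1" here (or anywhere in the chain) is the problem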
As you can see from the cert security scan, there are a number of other issues with the certificate that are unacceptable, especially in a paid certificate. [Edited for clarity - some of those warnings are likely server-specific, meaning the server is being allowed to communicate with the certificate in less-than-optimal ways]
https://www.ssllabs.com/ssltest/analyze.html?d=www.key.co.uk

It's unlikely that the signature problem is what's giving the "not secure" warning on the site at the moment (although it will become a major issue later in February), so you'll need to keep looking for resources called over HTTP if you're still getting warnings.
When I had a quick look at the home page, I didn't see any more warnings, as it appears you've fixed the image call that Andrew mentioned. You can use Chrome or Firefox Dev Tools to inspect any page that still isn't secure and see exactly which element is causing the failure. It often comes down to hardcoded images (including CSS background images) or hardcoded scripts. For example, your Quotations page is calling a script from Microsoft to validate the form, but it's failing because it's called over HTTP.
Knowing this, you'd want to check any other pages using that form validation. A thorough Screaming Frog crawl to look for any other wayward HTTP calls can also help dig out the remaining random culprits; see the sketch below for a quick scripted spot check.
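Here's a rough Python sketch of that spot check (assuming the third-party requests package; the regexes are deliberately simple). It only inspects the raw HTML of one page, so it will miss resources injected by JavaScript at runtime or referenced from inside CSS, which is why DevTools and a full crawl still matter:

import re
import requests

url = "https://www.key.co.uk/en/key/"
html = requests.get(url, timeout=10).text

# src attributes still pointing at plain HTTP (scripts, images, iframes)
insecure = re.findall(r'src=["\'](http://[^"\']+)["\']', html)
# Any <link> href over HTTP (stylesheets are the real mixed-content risk;
# things like canonicals are harmless but still worth cleaning up)
insecure += re.findall(r'<link[^>]+href=["\'](http://[^"\']+)["\']', html)

for resource in sorted(set(insecure)):
    print(resource)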
Hope that helps!
Paul
Sidenote: Your certificate authority is Thawte, which is connected with Symantec. Symantec has done such a bad job of securing their certificates that Chrome and other browsers no longer trust them, and in the near future those certificates are going to be officially distrusted and ignored. Symantec has in fact given up its Certificate Authority status and is transferring the business to a new company which does have a trusted infrastructure for issuing certificates. So you're going to need to deal with a new certificate in the not-too-distant future anyway.
Given the poor security of your existing cert, and the upcoming issues, if it were me, I'd be asking for a refund of my current cert, and replacing it with one from a more reliable issuer. I know that can mean a lot of extra work, but as these existing problematic certs go through the distrust process over the next 8 months, sites that haven't dealt with the issue are going to break.
It's possible that Thawte will build out a reliable migration process. At the very least, you need to have a strong conversation with your issuer about how to ensure you are getting the security and long-term reliability you've paid for. Sorry to be the bearer of bad news; this is a much bigger issue than a single warning. You can read up about it more here:
https://security.googleblog.com/2017/09/chromes-plan-to-distrust-symantec.html
-
Thank you.
Also, does anyone know if we need to rekey the certificate to replace the SHA-1 signature algorithm, what we should rekey it with, or whether my dev team should know this?
-
I also got this report from https://www.whynopadlock.com
Soft Failure: An image with an insecure URL of "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg" was loaded on line 1 of https://www.key.co.uk/en/key.
Errors that are reported on line 1 are generally not part of the source code. This error may be caused by an external javascript file which is writing to the page, however we are unable to reliably detect these scripts in our automated test.
Please contact us using the "Need Help?" link below if you need assistance with resolving this error.
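One more thought on that "line 1" / script-injected image: a report-only Content-Security-Policy header will make browsers report every insecure resource they actually load at runtime, including ones written into the page by external JavaScript, without blocking anything. A sketch of the header, where the /csp-reports endpoint is a placeholder you'd need to stand up yourself:

Content-Security-Policy-Report-Only: default-src https: 'unsafe-inline'; report-uri /csp-reports

Browsers that load any http:// resource on the page will then POST a small JSON report to that endpoint, which makes injected culprits much easier to trace ('unsafe-inline' is included just to keep inline scripts and styles from flooding the reports).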