Still Not Secure in Chrome
-
Hi
We migrated to HTTPS in November, but we still aren't showing as Secure.
I thought it was due to an insecure SHA-1 signature in the SSL certificate, so I'm waiting to get that fixed.
We had a few HTTP links outstanding, and those have been updated, but we're still getting the issue.
Does anyone have an idea of what it could be? https://www.key.co.uk/en/key/
-
I'm surprised to say... that SSL certificate you have is very poor quality and has a number of pretty significant security issues, in addition to the SHA-1 signature problem.
To answer your specific question, there's nothing you or your devs can do about the SHA-1 signature problem, as it exists on one of the certificates in the chain that is owned and controlled by Thawte (the cert issuer, or "Certificate Authority"), not on your own certificate. It is up to them to fix it.
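If you want to confirm for yourself where in the chain the weak signature lives, you can inspect the cert directly. Here's a rough sketch in Python using the third-party cryptography package (the package choice is just an assumption for illustration; `openssl s_client` on the command line tells you the same thing):

```python
# Minimal sketch: fetch the site's leaf certificate and print the hash
# algorithm used in its signature. Requires the third-party
# "cryptography" package (pip install cryptography).
import ssl

from cryptography import x509

HOST = "www.key.co.uk"

pem = ssl.get_server_certificate((HOST, 443))
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

# "sha1" here would mean your own cert is signed with SHA-1; if this
# prints "sha256", the weak signature sits on an intermediate cert in
# the chain, which only the issuing CA (Thawte) can reissue.
print(cert.signature_hash_algorithm.name)
```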
As you can see from the cert security scan, there are a number of other issues with the certificate that are unacceptable, especially in a paid certificate. [Edited for clarity: some of those warnings are likely server-specific, meaning the server is being allowed to communicate with the certificate in less-than-optimal ways.]
https://www.ssllabs.com/ssltest/analyze.html?d=www.key.co.uk
It's unlikely that the signature problem is what's giving the "not secure" warning on the site at the moment (although it will become a major issue later in February), so you'll need to keep looking for resources called over HTTP if you're still getting warnings.
When I had a quick look at the home page, I didn't see any more warnings, as it appears you've fixed the image call that Andrew mentioned. You can use Chrome or Firefox Dev Tools to inspect any page that isn't secure and see exactly which element is causing the failure. It often comes down to hardcoded images (such as CSS background images) or hardcoded scripts. For example, your Quotations page is calling a script from Microsoft to validate the form, but it's failing because it's called over HTTP.
Knowing this, you'd want to check any other pages using such form validation. A thorough Screaming Frog crawl to look for any other wayward HTTP calls can also help dig out the remaining random culprits.
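If you'd rather script a quick spot check of a single page yourself, something like the sketch below works. It's a minimal example assuming Python with the requests and beautifulsoup4 packages installed; note it only catches URLs hardcoded in the HTML, so CSS background images and script-injected resources still need Dev Tools or a full crawl:

```python
# Minimal sketch of a single-page mixed-content check. Assumes the
# third-party "requests" and "beautifulsoup4" packages are installed.
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.key.co.uk/en/key/"

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Tag/attribute pairs that commonly carry resource URLs.
CHECKS = [("img", "src"), ("script", "src"), ("link", "href"), ("iframe", "src")]

for tag, attr in CHECKS:
    for element in soup.find_all(tag):
        url = element.get(attr, "")
        if url.startswith("http://"):
            print(f"Insecure {tag}: {url}")
```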
Hope that helps!
Paul
Sidenote: Your certificate authority is Thawte, which is connected with Symantec. Symantec has done such a bad job of securing their certificates that Chrome and other browsers no longer trust them, and in the near future their certificates are going to be officially distrusted and ignored. Symantec has in fact given up their Certificate Authority status and is transferring the business to a new company that does have a trusted infrastructure for issuing certificates. So you're going to need to deal with a new certificate in the not-too-distant future anyway.
Given the poor security of your existing cert, and the upcoming issues, if it were me, I'd be asking for a refund of my current cert, and replacing it with one from a more reliable issuer. I know that can mean a lot of extra work, but as these existing problematic certs go through the distrust process over the next 8 months, sites that haven't dealt with the issue are going to break.
It's possible that Thawte will build out a reliable process for migrating. At the very least, you need to have a strong conversation with your issuer about how to ensure you are getting the security and long-term reliability you've paid for. Sorry to be the bearer of bad news about what is a much bigger issue. You can read up about it more here:
https://security.googleblog.com/2017/09/chromes-plan-to-distrust-symantec.html
-
Thank you.
Also, does anyone know if we need to rekey the certificate to replace the SHA-1 signature algorithm, what we should rekey it with, or whether my dev team should know this?
-
I also got this report from https://www.whynopadlock.com
Soft Failure: An image with an insecure URL of "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg" was loaded on line 1 of https://www.key.co.uk/en/key.
Errors that are reported on line 1 are generally not part of the source code. This error may be caused by an external javascript file which is writing to the page, however we are unable to reliably detect these scripts in our automated test.
Please contact us using the "Need Help?" link below if you need assistance with resolving this error.
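For mixed content injected by external JavaScript, which page-source scans like the one above can miss, a Content-Security-Policy-Report-Only header makes visitors' browsers report every insecure resource they actually load, without blocking anything. A minimal sketch, assuming a Flask app purely for illustration (any server or CDN that can set response headers works the same way):

```python
# Minimal sketch using Flask (an assumption purely for illustration;
# any server or CDN that can set response headers works the same way).
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def add_csp_report_only(response):
    # Report-Only mode: browsers report violations (e.g. images loaded
    # over plain HTTP) to /csp-report without blocking anything.
    response.headers["Content-Security-Policy-Report-Only"] = (
        "default-src https: 'unsafe-inline' 'unsafe-eval'; "
        "report-uri /csp-report"
    )
    return response

@app.route("/csp-report", methods=["POST"])
def csp_report():
    # Each report names the exact blocked-uri, which pinpoints insecure
    # resources even when they're injected by external JavaScript.
    print(request.get_json(force=True))
    return "", 204
```

Each violation report includes the exact blocked URI, so you can trace even line-1 injected images like the one whynopadlock flagged.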