SSL Cert error
-
Just implemented SSL with a wildcard cert, and I got an email from Google saying that my non-www cert is not valid.
Any ideas?
SSL/TLS certificate does not include domain name https://electrictime.com/
To: Webmaster of https://electrictime.com/,
Google has detected that the current SSL/TLS certificate used on https://electrictime.com/ does not include the https://electrictime.com/ domain name. This means that your website is not perceived as secure by some browsers. As a result, many web browsers will block users from accessing your site by displaying a security warning message. This is done to protect users’ browsing behavior from being intercepted by a third party, which can happen on sites that are not secure.
-
I suggest just redirecting from the non-www to the www version, or from http to https. This link may be helpful for you.
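A minimal sketch of that redirect in .htaccess, assuming Apache with mod_rewrite enabled and example.com as a placeholder domain (it sends every plain-http or non-www request to the https://www version in one hop):

```apache
RewriteEngine On
# Redirect anything that is not already https://www.example.com/...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Keep in mind the redirect only helps after the TLS handshake succeeds: if the certificate doesn't cover the hostname the visitor typed, the browser warning appears before the redirect can run.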
-
How did you implement it? Do you have access to Apache? If you do, this is how I have the SSL CRT/KEY files set up for about 20 of my websites, if you want something to check against. If you did it through php.ini, you will probably need to check the walkthrough with your web host.
<VirtualHost *:443>
    ServerAdmin some@email.com
    ServerName website.com
    ServerAlias www.website.com
    DocumentRoot /var/www/html/website/
    SSLEngine on
    SSLCertificateFile "/var/key/website.com.crt"
    SSLCertificateKeyFile "/var/key/website.com.key"
    <Directory /var/www/html/website/>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
-
Hi Thomas
If you registered the SSL certificate for www, it won't necessarily cover the non-www hostname. Make sure the hostname you signed up for SSL with matches the configuration you are using: if you bought an SSL certificate for www, then use www. Note also that a wildcard certificate for *.website.com covers www.website.com but not the bare website.com, unless the bare domain is also included as a Subject Alternative Name (some CAs include it automatically, others don't).
I have this issue with one of our sites but it is only saying this in the Opera browser.
https://stackoverflow.com/questions/40309552/do-i-need-an-ssl-certificate-for-www-and-non-www
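One way to see exactly which hostnames a certificate covers is to read its Subject Alternative Name list with openssl. A minimal sketch using a throwaway self-signed certificate as a stand-in (requires OpenSSL 1.1.1+ for -addext/-ext; against a live site you would instead pipe `openssl s_client -connect example.com:443 -servername example.com </dev/null` into the same `x509` command):

```shell
# Create a throwaway self-signed cert that names both hosts as SANs
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo.key -out /tmp/demo.crt \
  -subj "/CN=example.com" \
  -addext "subjectAltName=DNS:example.com,DNS:www.example.com"

# Print the SAN list -- the hostnames the cert is actually valid for
openssl x509 -in /tmp/demo.crt -noout -ext subjectAltName
```

If the bare domain is missing from that list (common with wildcard certs, since *.example.com matches www.example.com but not example.com), browsers will warn on the non-www host.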
Regards
Nigel
-
Hi There,
Your website is correctly redirected to the https://www version, so I wouldn't worry about the error, as the non-www version is not relevant in your case. You may also read the following articles on why this problem might have occurred:
https://www.clickssl.net/blog/do-i-need-different-ssl-certificates-for-www-non-www-domain
https://www.quora.com/How-can-an-HTTPS-certificate-work-for-both-the-www-and-non-www-domains
I hope this helps, let me know if you have further queries.
Best Regards,
Vijay
Related Questions
-
Error 404 Search Console
Hi all, we have a number of 404 HTTP statuses listed in Search Console; even after being addressed, the count doesn't decrease. What happened: we launched a website with URLs like www.meusite.com/url-abc and submitted these URLs in the sitemap. Google indexed them. ... For some reason, the URLs were changed four days later by a developer on my team. So I set up redirects from the "old" already-indexed URLs to the new ones (from /url-abc to /url-xyz), all correspondingly. I submitted the sitemap with the new URLs. We fixed the internal links, and then marked the errors as fixed in Search Console. But it does not work! Has anyone had a similar experience? Thanks for any advice!
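For reference, the per-URL redirect described above can be done with one mod_alias line per moved post in .htaccess, assuming Apache and purely illustrative paths:

```apache
# Permanently redirect the old, already-indexed URL to its replacement
Redirect 301 /url-abc /url-xyz
```

Note that Search Console typically keeps reporting old 404s for a while even after correct 301s are in place; the count only drops as Google recrawls each URL.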
Intermediate & Advanced SEO | | mobic0 -
Rich snippets error?
hello everyone, I have this problem with the rich snippets: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.visalietuva.lt%2Fimone%2Ffcr-media-lietuva-uab The problem is that it reports some kind of error, but I can't figure out what it is. We implemented the same code on our other websites: http://www.imones.lt/fcr-media-lietuva-uab and http://www.1588.lt/imone/fcr-media-lietuva-uab . The snippets appear on Google and work perfectly. The only site that has this problem is visalietuva.lt. I attached the image to show what I mean. I really need tips for this one. gbozIrt.png
Intermediate & Advanced SEO | | FCRMediaLietuva0 -
Can an incorrect 301 redirect or .htaccess code cause 500 errors?
Google Webmaster Tools is showing the following message: _Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request._ Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added to .htaccess incorrectly? Here is the 301 redirect code I am using in .htaccess:

RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index\.html|default\.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www\.example\.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]

Could adding the following code after that in the .htaccess potentially cause any issues?

# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES

(Edit) I'd like to add that there is a WordPress blog on the site too at www.example.com/blog with the following code in its .htaccess:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress

Thanks
Intermediate & Advanced SEO | | kimmiedawn0 -
Best way to fix 404 crawl errors caused by Private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors = 404 Not Found. I found that those 44 blog pages were set to Private mode (WordPress theme), causing the 404 issue. I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to private mode. Well, I noticed that all those 44 blog posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages under private mode to avoid getting hit for duplicate content issues. All other blog posts posted after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages?
A. Remove those 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of those 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)
I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing those blog posts. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision...
Thanks
Intermediate & Advanced SEO | | SEOEND0 -
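If the removal option (A) above is chosen, one hedged sketch: serving 410 Gone for the deleted posts tells crawlers the removal is deliberate, which tends to clear them from crawl-error reports faster than a plain 404. Assuming Apache with mod_alias and a purely hypothetical post path:

```apache
# Mark a deliberately removed post as permanently gone (HTTP 410)
Redirect gone /blog/hypothetical-2010-post
```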
Duplicate page content errors stemming from CMS
Hello! We've recently relaunched (and completely restructured) our website. All looks well except for some duplicate content issues. Our internal CMS (custom) adds /content/ to each page's URL. Our development team has also set up the URLs to work without /content/. Is there a way I can tell Google that these are the same pages? I looked into the parameters tool, but that seemed more in line with ecommerce and the like. Am I missing anything else?
Intermediate & Advanced SEO | | taylor.craig0 -
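One common fix for the /content/ duplication described above, sketched for Apache with mod_rewrite and a placeholder path scheme: 301 the /content/ variant to the clean URL so only one version gets indexed (a rel="canonical" tag pointing at the clean URL is the usual alternative when a redirect isn't possible):

```apache
RewriteEngine On
# Collapse /content/anything to /anything with a permanent redirect
RewriteRule ^content/(.*)$ /$1 [R=301,L]
```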
Robots.txt error
I currently have this in my robots.txt file:

User-agent: *
Disallow: /authenticated/
Disallow: /css/
Disallow: /images/
Disallow: /js/
Disallow: /PayPal/
Disallow: /Reporting/
Disallow: /RegistrationComplete.aspx

WebMatrix 2.0

On Webmaster > Health Check > Blocked URL I copy and paste the above code, then click Test, and everything looks OK. But when I log out and log back in, I see the code below under Blocked URL:

User-agent: *
Disallow: /

WebMatrix 2.0

Currently, Google doesn't index my domain and I don't understand why this is happening. Any ideas? Thanks
Seda
Intermediate & Advanced SEO | | Rubix0 -
URL errors in Google Webmaster Tool
Hi, within Google Webmaster Tools, the 'Crawl errors' report shows the 404 errors it has found when you click 'Not found'. Clicking any column heading reorders the list. One column is 'Priority': do you think Google is telling me it has ranked the errors by priority of needing a fix? There is no reference to this in the Webmaster Tools help. Many thanks, Nigel
Intermediate & Advanced SEO | | Richard5551 -
Squarespace Errors
We have a website hosted by Squarespace. We are happy with SS, but we have run some crawl diagnostics and noticed several errors. These are primarily: Duplicate Page Title, Duplicate Page Content, and Client Error (4xx). We don't really understand why these errors are taking place, and wonder if someone in the SEOmoz forum with a firm understanding of SS is able to assist us with this? rainforestcruises.com thanks.
Intermediate & Advanced SEO | | RainforestCruises0