HTTP and HTTPS issue in Google SERPs
-
Hi,
I've noticed that Google is indexing some of my pages as regular http, like this:
http://www.example.com/accounts/
and some pages are being indexed as https, like this:
https://www.example.com/platforms/
When I performed a site audit in various SEO tools, I got around 450 duplicated pages, shown as pairs of the same URL: once with http and once with https.
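(For context, the pairing those audit tools report can be reproduced from a plain list of crawled URLs. A minimal illustrative sketch, using placeholder URLs, not any real crawl data:)

```python
from urllib.parse import urlsplit, urlunsplit

def scheme_duplicates(urls):
    """Return scheme-less URLs that were crawled as both http and https."""
    seen = {}
    for url in urls:
        parts = urlsplit(url)
        # Key on everything except the scheme (host + path + query).
        key = urlunsplit(("", parts.netloc, parts.path, parts.query, ""))
        seen.setdefault(key, set()).add(parts.scheme)
    return sorted(k for k, schemes in seen.items() if {"http", "https"} <= schemes)

crawled = [
    "http://www.example.com/accounts/",
    "https://www.example.com/accounts/",
    "https://www.example.com/platforms/",
]
print(scheme_duplicates(crawled))  # ['//www.example.com/accounts/']
```

Every URL the function returns is one http/https duplicate pair from the crawl.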
On our site, people can register and open an account, and later log in to our website with their login details.
In our company I'm not the one responsible for the site's maintenance, so I would like to know whether this is an issue, and if it is, what is causing it and how to fix it, so I can forward the solution to the person in charge.
Additionally, I would like to know in general what the real purpose of HTTPS vs. HTTP is, and which of the two our website should prefer. Currently, when URLs are typed manually into the address bar, all of them load fine, with or without https written at the start.
I'm not allowed to reveal our site's name, which is why I wrote example.com instead; I hope you can understand that.
Thank you so much for your help. I'm looking forward to reading your answers.
-
Hello Joni,
One of the solutions from the thread Matt linked to earlier is very interesting. I've never tried it, but I will definitely give it a shot soon. Read about it here. In short, you would add a rule to the .htaccess file that tells user agents requesting a secure page on the site that they are disallowed from visiting any page.
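I haven't verified this myself, but the approach described above is commonly sketched along these lines, assuming Apache with mod_rewrite; the file name robots_ssl.txt is a hypothetical example:

```apache
# Hypothetical sketch: when robots.txt is requested over HTTPS (port 443),
# serve an alternate file that disallows everything, so crawlers are told
# they may not visit any secure page.
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ /robots_ssl.txt [L]

# where robots_ssl.txt contains:
#   User-agent: *
#   Disallow: /
```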
Since I've never tried that solution, and so far haven't talked to anyone who has, my recommendation is going to be the simplest one I know of:
Make sure your rel=canonical tag uses the http version of the URL. It should not change when viewing the https version; the https page should still carry an http rel=canonical tag. That should solve the problem without redirecting users to a non-secure page and without editing your .htaccess file.
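Concretely, using the asker's example.com placeholder, the head of the accounts page would contain the same tag regardless of which protocol it is served over:

```html
<!-- Served identically on http://www.example.com/accounts/
     and https://www.example.com/accounts/ -->
<link rel="canonical" href="http://www.example.com/accounts/" />
```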
-
Hi Joni - have a look at this previous thread; it will help you with this issue and with solving the duplicate content that is being served.
http://moz.com/community/q/duplicate-content-and-http-and-https
Related Questions
-
How to change 302 redirect from http to https
Hi gang. Our site currently has a 302 redirect from the HTTP version of the homepage to the HTTPS version. I understand this really should be changed to a 301 redirect, but I'm having a little trouble figuring out exactly how this should be done. Some places on the internet tell me I can edit our .htaccess file to specify the type of redirect, but our .htaccess file seems to be missing some of the information shown in theirs. Can anyone tell me what needs to be changed in the .htaccess file, or whether there's a simpler way to change the 302 to a 301? Many thanks 🙂
htaccess:
# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
# EXPIRES CACHING
ExpiresActive On
ExpiresByType image/jpg "access plus 6 months"
ExpiresByType image/jpeg "access plus 6 months"
ExpiresByType image/gif "access plus 6 months"
ExpiresByType image/png "access plus 6 months"
ExpiresByType text/css "access plus 10 days"
ExpiresByType application/pdf "access plus 10 days"
ExpiresByType application/x-shockwave-flash "access plus 10 days"
ExpiresByType image/x-icon "access plus 6 months"
ExpiresDefault "access plus 2 days"
# EXPIRES CACHING
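For later readers: on Apache with mod_rewrite (as in the pasted file), a force-HTTPS 301 is commonly added near the top of the .htaccess file. A minimal sketch, not taken from this thread:

```apache
# Sketch: permanently (301) redirect every HTTP request to its HTTPS
# counterpart. The R=301 flag is what makes the redirect permanent;
# a bare [R] would issue a temporary 302 instead.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```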
-
Switching from HTTP to HTTPS and google webmaster
Hi, I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well. Regarding Google Webmaster Tools and what needs to be done: I'm very confused by the Google documentation on this subject around HTTPS. Does all my crawl data and indexing from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at "change of address" in the Webmaster Tools settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all. I've also tried adding the HTTPS version to the console as well, but the HTTPS version is showing a severe warning: "is robots.txt blocking some important pages?". I don't understand this error, as it's the same file as on the HTTP site, generated by All in One SEO Pack for WordPress (see below). The warning is against line 5, saying it will ignore it. What I don't understand is that I don't get this error in the Webmaster console with the HTTP version, which uses the same file. Any help and advice would be much appreciated. Kind regards, Steve
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
-
Google Sitelinks
Hello, good afternoon. I am having a site issue with sitelinks. For some reason, when I search Google for the brand I represent, "California Olive Ranch", sitelinks are not being generated. When I search for "Cal Olive Ranch", our sitelinks are generated. Our domain is Californiaoliveranch.com. Is there a way to tell Google to change the sitelinks to match our domain and brand name? Is this something that can be done in Google Webmaster Tools? Thank you very much for your help. Adam P
-
Google not showing my website ?
The website is medicare.md. If you search for the term "medicare doctors PG county maryland", it is #1 in Bing and Yahoo, but it is not showing even in Google's first TEN pages, although it is not banned. Interestingly, if you do that search on google.co.pk, it is #4. Quite puzzling! I would appreciate any help or advice. Sherif Hassan
-
Google Indexing
Hi everybody, I am having kind of an issue with the results Google is showing for my site. I have a multilingual site whose main language is Catalan. But of course, if I am looking at results in Spanish (google.es) or in English (google.com), I want Google to show the results with the proper URL, title, and description. My brand is "Vallnord", so if you type this into Google you will be shown the result in Catalan (which is not optimized at all yet), but only if you search "vallnord.com/es" will you be shown the result in Spanish. What do I have to do in order for Google to read this the way I want? Regards, Guido.
-
Duplicate content and http and https
Within my Moz crawl report, I have a ton of duplicate content caused by identical pages served at both http and https URLs. For example: http://www.bigcompany.com/accomodations https://www.bigcompany.com/accomodations The strange thing is that 99% of these URLs are not sensitive in nature and do not require any security features: no credit card information, booking, or carts. The web developer cannot explain where these extra URLs came from or provide any further information. Advice or suggestions are welcome! How do I solve this issue? THANKS MOZZERS
-
Google Off/On Tags
I came across this article about telling Google not to crawl a portion of a webpage, but I never hear anyone in the SEO community talk about these tags. http://perishablepress.com/press/2009/08/23/tell-google-to-not-index-certain-parts-of-your-page/ Does anyone use them and find them to be effective? If not, how do you suggest noindexing/canonicalizing a portion of a page to avoid duplicate content that shows up on multiple pages?
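For later readers: the tags the linked article covers are plain HTML comments of the form below. Note that, as far as I know, they were honored by the Google Search Appliance rather than by Google's web-search crawler, so treat this as illustrative only:

```html
<p>Main product copy that should be indexed normally.</p>

<!--googleoff: index-->
<p>Repeated legal boilerplate we would rather not have indexed.</p>
<!--googleon: index-->
```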
-
How to disallow google and roger?
Hey guys and girls, I have a question. I want to disallow all robots from accessing a certain root link:
# Get rid of bots
User-agent: *
Disallow: /index.php?_a=login&redir=/index.php?_a=tellafriend%26productId=*
Will this stop the bots from accessing any web link that has the prefix you see before the asterisk? And will at least Google and Roger obey it, since it reads "User-agent: *"? I know this isn't the standard procedure, but if it works for Google and the SEOmoz bot, we are good.
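One caveat worth noting: whether the rule matches depends on the parser. Googlebot expands * as a wildcard, while the original robots.txt standard only does literal prefix matching. You can sanity-check the prefix behavior with Python's standard-library parser; a simplified rule is used here for illustration, since the stdlib parser does not expand wildcards:

```python
from urllib.robotparser import RobotFileParser

# Sketch: the stdlib parser does plain prefix matching (no * expansion,
# unlike Googlebot), so we test only the literal prefix of the rule.
robots = RobotFileParser()
robots.parse([
    "User-agent: *",
    "Disallow: /index.php?_a=login",
])

# Any URL starting with the disallowed prefix is blocked...
print(robots.can_fetch("*", "http://example.com/index.php?_a=login&redir=/x"))  # False
# ...while other index.php URLs remain crawlable.
print(robots.can_fetch("*", "http://example.com/index.php?_a=viewCat"))         # True
```

Since `User-agent: *` covers every crawler, both Googlebot and Rogerbot fall under the same rule group.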