Proper method of consolidating HTTPS to HTTP?
-
A client's site has an application area (a directory) containing a form that needs to be secured with SSL. The vast majority of the site is static and does not need to be secured. We have run into situations where a visitor navigates the site over HTTPS, which then throws security errors. We want to keep visitors (and crawlers) on HTTP for the static pages and have only the secure area served over SSL. How is this best accomplished?
Our developer wants to add a rule to the global configuration file in PHP that uses 301 redirects to ensure static pages are accessed over HTTP and the secure directory is accessed over HTTPS. Is this the proper approach? Are there any SEO considerations we should keep in mind?
Thanks.
-
Hi there,
I would agree with your developer on using 301 redirects to ensure that all static pages resolve only to the HTTP version while the secure pages resolve only to HTTPS.
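A minimal sketch of such a global rule in PHP might look like the following; the /secure/ path is an assumption standing in for whatever directory actually holds the application area:
<?php
// Sketch of a global protocol-enforcement rule for a shared PHP include.
// Assumes the secure form area lives under /secure/ -- adjust to match the site.
$is_https  = ($_SERVER['SERVER_PORT'] == '443');
$path      = $_SERVER['REQUEST_URI'];
$in_secure = (strpos($path, '/secure/') === 0);

if ($is_https && !$in_secure) {
    // Static page requested over HTTPS: 301 back to the HTTP version
    header('Location: http://' . $_SERVER['HTTP_HOST'] . $path, true, 301);
    exit;
} elseif (!$is_https && $in_secure) {
    // Secure area requested over HTTP: 301 up to the HTTPS version
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $path, true, 301);
    exit;
}
?>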
As for SEO, search engines should follow these 301 redirects just fine, but to be safe it is also a good idea to designate canonical URLs telling them to index only the non-HTTPS pages. The PHP code below detects which version of a page is being accessed and, on HTTPS requests, inserts a canonical tag pointing to the non-HTTPS version.
<?php
// Build the current URL from the host and the requested path
$currenturl = $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
// Check whether the request came in on the secure HTTPS port, which is 443
if ($_SERVER["SERVER_PORT"] == "443") {
    // Connected on the secure port, so formulate the HTTP canonical version
    $canonicalversion = "http://" . $currenturl;
    // Echo the canonical version into the HTML as a link rel="canonical" tag
    echo '<link rel="canonical" href="' . $canonicalversion . '" />';
}
?>
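One caveat worth flagging: on hosts where SSL terminates at a load balancer or proxy, SERVER_PORT may not report 443 for secure requests. Checking $_SERVER['HTTPS'] is often a more reliable test:
<?php
// Alternative HTTPS detection that does not depend on the port number;
// PHP sets $_SERVER['HTTPS'] to a non-empty value ('off' on some IIS setups)
// when the request came in over HTTPS.
$is_https = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';
?>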
Related Questions
-
Googlebot crawl error: JavaScript method is not defined
Hi all, I have a problem that has been a pain in the ****. In my logs I see tons of crawl errors from "Googlebot" saying that a specific JavaScript method does not exist. I then go to the affected page, test it in a web browser, and the page works without any JavaScript errors. Can someone help me resolve this issue? Thanks in advance.
Technical SEO | FreddyKgapza
-
Migrating HTTP Site to HTTPS Version
Hello, this coming weekend we will be changing our HTTP sites to their HTTPS versions. I have a very quick question regarding Google Search Console. Because the migration is happening over a weekend, we want to get as much as possible set up beforehand. Is there any risk in adding the new properties to Search Console before the sites are live? I want to deliver the Search Console verification files to our IT team in advance for them to add to the site. Once I get the okay that the migration went successfully, I would go into Search Console, click the Verify button to get the sites verified, and then use Fetch as Google to help speed up indexing a bit and ensure there are no errors. Any insight on this would be greatly appreciated! Amiee
Technical SEO | Amiee
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
The dashboard then lists sub-sitemaps, all of which are HTTP rather than HTTPS, which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
There is also the warning "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for the following sitemap URLs:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC Sitemaps section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I also see that the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
Technical SEO | Dan-Lawrence
-
HTTP to HTTPS Backlink Value
We updated our website from HTTP to HTTPS, and I want to know how our backlinks are affected by this. The new site redirects all old HTTP links to the HTTPS home page. How does this affect more specific backlinks like http://www.mysite.com/about ? The old http://www.mysite.com/about is now being redirected to https://www.mysite.com . Do I need to set up redirects to pass value to my new pages?
Technical SEO | nat88han
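On this question: sending every old HTTP URL to the HTTPS home page forfeits the page-level link value of backlinks like the /about one above, while a path-preserving 301 keeps it. A minimal PHP sketch, assuming the same paths exist on the HTTPS site:
<?php
// Path-preserving 301 from HTTP to HTTPS, so http://www.mysite.com/about
// lands on https://www.mysite.com/about instead of the home page.
$is_https = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';
if (!$is_https) {
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
?>
-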
HTTPS - should I do a change of address in WMT?
We have added an SSL cert to our site. Should I submit a change of address in WMT and submit a new sitemap to replace the HTTP one?
Technical SEO | webguru2014
-
Proper structure for a site with multiple categories of the same products
Hi, we have products (trophies and awards) that can be categorized in many ways. Using award medals as an example: medals by type (1 1/2", 2", etc.); medals by sport (Baseball, Basketball, Cheer); medals by style (Color, Gold, Silver, Bronze). Right now, we have an Award Medals section off of our home page. The section has a decent page rank, but should be much better (I think). My guess is that we are losing page rank since we have separate sections for the groups above, as we want our customers to be able to find the medals easily. Unfortunately, when we set up our site 10 years ago, we organized by type, and this is what hangs off the home page. The other groupings we added more recently. I have attached a snapshot of what the sections look like. We would like customers to find an individual medal when they do a Google search, for example a search for baseball medals. In Google, they likely would not search for 1 1/2" medals. My question is this: can we keep the same structure we have today (to enable customer flexibility) but improve page rank and also have sections like baseball medals rank well? I have thought about using canonical tags, but the pages are not the same: in one case it is all baseball medals, in another it is all 1 1/2" medals, etc. Thanks for your help!!
Technical SEO | trophycentraltrophiesandawards
-
Proper way to 404 a page on an Ecommerce Website
Hello. I am working on a website that has over 15,000 products. When one of these is no longer available (like when it's discontinued), the page it's on 302s to a 404 page. Example: www.greatdomain.com/awesome-widget. When the awesome widget is no longer available, www.greatdomain.com/awesome-widget 302s to www.greatdomain.com/404. For the most part, these pages are not worthy of 301s because of a lack of page rank/suitable landing pages, but is this the correct way to handle them for search engines? I've seen varying opinions. Thanks!
Technical SEO | Blenny
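For what it's worth, the commonly recommended pattern here is to answer at the product's own URL with a real 404 (or 410) status rather than 302ing to a 404 page, so search engines drop the URL cleanly. A minimal PHP sketch; find_product_by_slug() is a hypothetical placeholder for the site's own product lookup:
<?php
// Hypothetical lookup -- replace find_product_by_slug() with your own product query.
$product = find_product_by_slug($_GET['slug']);
if ($product === null) {
    // Discontinued product: serve a real 404 status at this URL
    // instead of 302-redirecting to a separate /404 page.
    header('HTTP/1.1 404 Not Found');
    include '404.php'; // the site's not-found template
    exit;
}
?>
-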
How do you disallow HTTPS?
I currently have a site (startuploans.org) that runs everything over HTTP. Recently we added an online application to process loan apps, and for that one section we configured SSL (https://www.startuploans.org/secure/). The problem is that if I go to the HTTPS URL for any of my other pages, they show up too. I was going to just 301 everything from HTTPS, but because the secure area is in a subdirectory I can't. Canonical URLs won't work either, because it's a totally different system and the pages are generated in an odd manner. It's really just one page that needs to be disallowed. Is there any way to disallow all HTTPS requests in robots.txt while keeping all the HTTP requests working as normal?
Technical SEO | WebsiteConsultants
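One workaround often used for this kind of setup is to route requests for robots.txt through a small PHP script and vary the rules by protocol. A sketch, assuming the web server can rewrite /robots.txt to a PHP handler (e.g. an Apache rule like RewriteRule ^robots\.txt$ /robots.php [L]):
<?php
// robots.php -- serve different robots.txt content for HTTP vs HTTPS requests.
header('Content-Type: text/plain');
$is_https = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';
if ($is_https) {
    // Block all crawling of the HTTPS version
    echo "User-agent: *\nDisallow: /\n";
} else {
    // Allow normal crawling of the HTTP version
    echo "User-agent: *\nDisallow:\n";
}
?>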