HTTPS & HTTP
-
I have my website (HTTP://thespacecollective.com) set in Google Webmaster Tools as the preferred domain, as opposed to HTTPS. But should all of my on-page links be HTTP?
For instance, clicking the Home button on my home page takes the user to the HTTP version, but typing the domain name into the address bar takes you to HTTPS.
Could this be causing problems for my SEO?
-
Thank you!
-
Yes, you should have both active in Search Console, but set HTTPS as the preferred version.
-
You should keep both the HTTP and HTTPS variants active in Google Search Console (you should also add the non-www variants and set the www version as your preferred version).
You do not set anything up within GSC to direct HTTP to HTTPS (i.e. to tell Google that you are changing protocols); this is all done via redirects, as Logan suggests. Here's a great page which should help clarify this for you:
http://webmasters.stackexchange.com/questions/68435/moving-from-http-to-https-google-search-console
-
Thanks for the additional info, but I think you missed my question. Please see the attached image.
I have HTTP and HTTPS set up on Google Search Console. Which one should I be using, or should both be active?
-
Yes. As the bots hit the URLs in your sitemap, they are forced to step through the redirect, which is what you want. They won't notice the new location if you don't point it out to them, and this is the most efficient way to do so.
*To be clear, since this gets confusing: the location of your XML sitemap should be the HTTPS URL (https://thespacecollective.com/sitemap.xml), but the URLs listed inside it should be HTTP.
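As an illustration only (the product page URL below is made up, not taken from the site), a sitemap served at that HTTPS location could temporarily keep listing HTTP URLs like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- <loc> entries stay on HTTP for now so crawlers step through the 301 redirects -->
  <url>
    <loc>http://thespacecollective.com/</loc>
  </url>
  <url>
    <loc>http://thespacecollective.com/example-page</loc>
  </url>
</urlset>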
Also, add this line to your robots.txt file (as the first line or the last line; it doesn't really matter which):
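That line is presumably the Sitemap directive pointing at the sitemap location mentioned above (an assumption based on context), for example:
# robots.txt - tell crawlers where the XML sitemap now lives
Sitemap: https://thespacecollective.com/sitemap.xml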
-
Thanks, Logan. Now I have two sites set up in Google Search Console, HTTP and HTTPS. The HTTP version has the sitemap and pretty much everything set up; should I just keep using this even though the site will now be HTTPS?
-
When you set up the 301 redirect rule that sends HTTP requests to HTTPS, Google will notice that. Leave your XML sitemap the way it is (with HTTP URL references) for 30 days. This will give them sufficient time to crawl your XML sitemap and learn your new protocol as they hit the redirects. Once most of your indexed pages have switched to HTTPS, you can update your XML to include the secure URLs.
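If the site runs on Apache, that redirect rule is often implemented in the .htaccess file with something along these lines (a generic sketch, not this site's actual configuration; nginx or IIS use different syntax):
# Permanently (301) redirect every HTTP request to the same URL over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]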
-
Thank you for the links; I have read through each and have decided to change to HTTPS as you advise. I've done everything with the exception of informing Google that the new site is HTTPS as opposed to HTTP. How do I make them aware?
I have set up HTTP and HTTPS in Webmaster Tools, but how do I tell Google which one is relevant in order to stop any duplicate content issues?
-
From what I understand, you've already decided to split your traffic between HTTP and HTTPS. If this is correct, I would urge you to reconsider and redirect all traffic to the HTTPS versions, as there are more issues to consider than duplicate content alone, particularly as you are an e-commerce store. The latest (and future) versions of Chrome and Firefox will more clearly highlight unsecured connections. This is from Google's security blog (https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html?m=1):
"In following releases, we will continue to extend HTTP warnings, for example, by labelling HTTP pages as “not secure” in Incognito mode, where users may have higher expectations of privacy. Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS."
Chrome is the world's most popular browser, used by over 50% of all internet users. A red triangle with the words 'Not Secure' displayed next to ANY page on your site is going to turn visitors away, and if over half of your visitors are receiving such a message, the consequences will not be good.
Google is pushing users toward HTTPS (https://moz.com/blog/https-tops-30-how-google-is-winning-the-long-war), so I would suggest that it's a misstep to swim against the tide.
There are also other minor benefits to serving all of your pages via HTTPS: it's a lightweight ranking signal, and it brings better support for browser compression, among other things.
Here's another article that covers the recent changes.
https://www.searchenginejournal.com/google-is-requiring-https-for-secure-data-in-chrome/183756/
However you proceed, I hope this goes smoothly for you.
Good luck.
-
Perfect, thank you. I'm doing this as we speak!
-
Rolling back to HTTP for non-checkout pages is an option as well. The main point I was trying to make was to not have both versions of your URLs accessible/indexable.
-
Thanks for this, Logan. Surely it makes more sense for me to simply change my website to HTTP and just keep Cart/Checkout, etc. as HTTPS? I see changing to HTTPS as a big risk and a lot of unnecessary work for very little benefit.
-
Hi,
Both the HTTP and HTTPS versions of your site will render, and that's a problem. Since you've got an SSL certificate and it's been applied to the home page, you should make your entire site secure. Once you've done that, you'll want to apply a redirect rule that sends all HTTP requests to the HTTPS version. Because you're not currently doing that, you're running the risk of duplicate content issues. Once the redirect is in place, yes, you should set the primary domain in Google Search Console (WMT) as HTTPS. There are a few other steps you'll want to take as well - Cyrus Shepard wrote a great post detailing all the necessary steps for a secure migration, and I highly recommend reading it.
Additionally, when people on your site are bouncing back and forth between HTTP and HTTPS, it's destroying your data integrity in Google Analytics. Going from an HTTP page to an HTTPS page breaks the session and starts a new one that will be attributed to direct traffic. You can see how this would quickly become a nightmare for accurate analysis and measurement. If you follow the steps in Cyrus' post, your GA data should return to normal because users won't be going back and forth between secure and non-secure pages.