HTTPS & HTTP
-
I have my website (HTTP://thespacecollective.com) marked in Google Webmaster Tools as the primary domain, as opposed to HTTPS. But should all of my on-page links be HTTP?
For instance, clicking the Home button on my home page takes you to the HTTP version, but typing the domain name into the address bar takes you to HTTPS.
Could this be causing SEO problems for me?
-
Thank you!
-
Yes, you should have both active in Search Console, but set the HTTPS version as the preferred one.
-
You will continue to have both the HTTP and HTTPS variants active in Google Search Console (you should also add the non-www variants and set www as your preferred version).
You do not set anything up within GSC to direct HTTP to HTTPS (i.e., to tell Google that you are changing protocols); this is all done via redirects, as Logan suggests. Here's a great page which should help clarify this for you:
http://webmasters.stackexchange.com/questions/68435/moving-from-http-to-https-google-search-console
-
Thanks for the additional info, but I think you missed my question. Please see the attached image.
I have HTTP and HTTPS set up on Google Search Console. Which one should I be using, or should both be active?
-
Yes, as the bots hit the URLs in your sitemap, it forces them to step through the redirect, which is what you want. They won't notice the new location if you don't point it out to them, and this is the most efficient way to do so.
*To be clear, since this gets confusing: the URL of the location of your XML sitemap should be HTTPS://thespacecollective.com/sitemap.xml, but the URLs listed in it should be HTTP.
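A minimal sketch of what that looks like (the home page entry here is just a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- This file lives at https://thespacecollective.com/sitemap.xml,
       but the listed URLs keep the HTTP protocol for now -->
  <url>
    <loc>http://thespacecollective.com/</loc>
  </url>
</urlset>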
Also, add this line to your robots.txt file; as the first line or last line, it doesn't really matter:
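Presumably this is the sitemap directive, pointing crawlers at the secure sitemap location mentioned above:
# Points crawlers at the XML sitemap's new secure location
Sitemap: https://thespacecollective.com/sitemap.xml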
-
Thanks Logan. Now I have two sites set up in Google Search Console, HTTP and HTTPS. The HTTP version has the sitemap and pretty much everything set up; should I just keep using this even though the site will now be HTTPS?
-
When you set up the 301 redirect rule that sends HTTP requests to HTTPS, Google will notice that. Leave your XML sitemap the way it is (with HTTP URL references) for 30 days. This will give them sufficient time to crawl your XML sitemap and learn your new protocol as they hit the redirects. Once most of your indexed pages have switched to HTTPS, you can update your XML to include the secure URLs.
-
Thank you for the links, I have read through each and have decided to change to HTTPS as you advise. I've done everything with the exception of informing Google that the new site is https as opposed to http. How do I make them aware?
I have set up http and https in Webmaster Tools, but how do I tell Google which one is relevant in order to stop any duplicate content issues?
-
From what I understand, you've already decided to split your traffic between HTTP and HTTPS. If this is correct, I would urge you to reconsider and redirect all traffic to the HTTPS versions, as there are more issues to consider than duplicate content, particularly as you are an e-commerce store. The latest (and future) versions of Chrome and Firefox will more clearly highlight unsecured connections. This is from Google's security blog (https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html?m=1):
"In following releases, we will continue to extend HTTP warnings, for example, by labelling HTTP pages as “not secure” in Incognito mode, where users may have higher expectations of privacy. Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS."
Chrome is the world's most popular browser, used by over 50% of all internet users. If your site displays a red triangle with the words 'Not Secure' next to it on ANY page, that is going to turn visitors away. If over half of your visitors are receiving such a message, the consequences will not be good.
Google are pushing users toward HTTPS (https://moz.com/blog/https-tops-30-how-google-is-winning-the-long-war), so I would suggest that it's a misstep to swim against the tide.
There are also other minor benefits to serving all of your pages via HTTPS: it's a minor ranking signal, and browsers only enable newer features such as HTTP/2 and Brotli compression over secure connections, among other things.
Here's another article that covers the recent changes.
https://www.searchenginejournal.com/google-is-requiring-https-for-secure-data-in-chrome/183756/
However you proceed, I hope this goes smoothly for you.
Good luck.
-
Perfect, thank you. I'm doing this as we speak!
-
Rolling back to HTTP for non-checkout pages is an option as well. The main point I was trying to make was to not have both versions of your URLs accessible/indexable.
-
Thanks for this Logan. Surely it makes more sense for me to simply change my website to HTTP and just keep Cart/Checkout, etc. as HTTPS? I see changing to HTTPS as a big risk and a lot of unnecessary work for very little benefit.
-
Hi,
Both the HTTP and HTTPS versions of your site will render, and that's a problem. Since you've got an SSL certificate and it's been applied to the home page, you should make your entire site secure. Once you've done that, you'll want to apply a redirect rule that sends all HTTP requests to the HTTPS version. Because you're not currently doing that, you're running the risk of duplicate content issues. Once you've done that, yes, you should set the primary domain in Google Search Console (WMT) as HTTPS. There are a few other steps you'll want to take as well - Cyrus Shepard wrote a great post detailing all the necessary steps for a secure migration, and I highly recommend reading it.
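If your server is Apache (an assumption on my part - adjust for whatever you're actually running), a minimal .htaccess sketch of that sitewide rule looks like this:
# Send every HTTP request to its HTTPS equivalent with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
On nginx or IIS the syntax differs, but the goal is the same: a single 301 hop from every HTTP URL to its HTTPS twin.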
Additionally, when people on your site are bouncing back and forth between HTTP and HTTPS, it's destroying your data integrity in Google Analytics. Going from an HTTP page to an HTTPS page breaks the session and starts a new one that will be attributed to direct traffic. You can see how this would quickly become a nightmare for accurate analysis and measurement. If you follow the steps in Cyrus' post, your GA data should return to normal because users won't be going back and forth between secure and non-secure.
-
Related Questions
-
Our site recently switched from HTTP to HTTPS. Do I still need to set up a redirect for the incoming links pointing to HTTP?
Our site recently switched from HTTP to HTTPS. If you type in http://www.websitename.com it will automatically go to https://www.websitename.com ... my question is: do I still need to create a redirect in the .htaccess file to ensure we don't lose all the links currently pointing to the HTTP version of the website?
Technical SEO | ninel_P
-
What is the best way to fix HTTP/HTTPS duplication?
Hi all! I have a question about how to fix the HTTP/HTTPS duplication problem. What is the best way to fix it, in your opinion/experience? I, for instance, have chosen to put "noindex, nofollow" on the HTTPS version. Each page of my site has an HTTPS version, so I put this meta robots tag on all of them... But I am not sure what happens with all the backlinks to "https" URLs that I have; I've just checked, and I have some... What do you think about it? Thanks in advance for helping!
Technical SEO | Red_educativa
-
HTTP Vary: User-Agent - Server or Page Level?
Looking for any insights regarding the usage of the Vary HTTP header, mainly around the idea that search engines will not like having a Vary HTTP header on pages that don't have a mobile version, which means the header will need to be implemented on a page-by-page basis. Additionally, does anyone have experience with the usage of the Vary HTTP header and CDNs like Akamai? Google still recommends using the header, even though it can present some challenges with CDNs. Thanks!
Technical SEO | burnseo
-
A few misc Webmaster Tools questions & robots.txt, etc.
Hi, I have a few general misc questions re: robots.txt & GWT:
1) In the robots.txt file, what do the lines below block? Internal search?
Disallow: /?
Disallow: /*?
2) Also, the site's feeds are blocked in robots.txt. Why would you want to block a site's feeds?
3) What's the best way to deal with the below:
- an old removed page that's returning a 500 response code
- a soft 404 for an old removed page that has no current replacement
- old removed pages returning a 404
The old pages didn't have any authority or inbound links, hence is it best/OK to simply create a URL removal request in GWT?
Cheers, Dan
Technical SEO | Dan-Lawrence
-
HTTPS pages still in the SERPs
Hi all, my problem is the following: our CMS (self-developed) produces HTTPS versions of our "normal" web pages, which means duplicate content. Our IT department put "noindex, nofollow" on the HTTPS pages; that was about 6 weeks ago. I check the number of indexed pages once a week and still see a lot of these HTTPS pages in the Google index. I know that I may hit different data centers and that these numbers aren't 100% valid, but still... sometimes the number of indexed HTTPS pages even moves up. Any ideas/suggestions? Wait for a longer time? Or take the time and go to Webmaster Tools to kick them out of the index? Another question: for a nice query, one HTTPS page ranks No. 1. If I kick the page out of the index, do you think the HTTP page will take over the No. 1 position? Or will the ranking be lost? (It sends some nice traffic :-))... thanks in advance 😉
Technical SEO | accessKellyOCG
-
Ajax #! URLs, Linking & Meta Refresh
Hi, we recently underwent a platform change, and unfortunately our updated ecommerce site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc. - it needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing AJAX #! on our site, and I've been tasked with creating a document of items that I will test to see if this solution will help our rankings, indexing, etc. on Google (I've read about the issues with Bing). I have 2 questions: 1. Do I need to notify our content team, who works on our linking strategy, about the new URLs? Would we use the #! URL (for SEO), or would we continue to use the clean URL (without the #!) for inbound links? 2. When our site transferred over, we used meta refresh on all of the pages instead of 301s for some reason. Instead of going to a clean URL, our meta refresh says this: . Would I update it to have the #! in the URL? Should I try to clean up the meta refresh so it goes to an actual www. URL and not this browsererrorview page? Or just push for the 301? I have read a ton of articles, including the GWT docs, but I can't seem to find any solid information on these specific questions, so any help I can get would be greatly appreciated. Thanks!
Technical SEO | Improvements
-
.htm vs. .aspx page extensions & duplicate content
We have a client whose site is fairly new, so there isn't much in the way of SEO results yet. In their content management system they have implemented friendly URLs and changed the extensions from .aspx to .htm. Now the .htm pages are all indexed in Google, but when I run a campaign report in SEOmoz it shows that all pages are duplicated, with both .htm and .aspx versions of each page. Should we do 301 redirects from the .aspx pages to the .htm pages? Or would we be safe removing the .htm pages and letting Google reindex the site with the .aspx page extensions? Does Google have any preference as to what the page extensions are, as long as the URLs include keywords?
Technical SEO | IvieDigital
-
Will using https across our entire site hurt our external backlinks?
Our site is secured throughout, so it loads sitewide as HTTPS. It is canonicalized properly - any attempt to load an existing page as HTTP will force it to HTTPS. My concern is with backlinks. We've put a lot of effort into social media, so we're getting some nice blog linkage. The problem is that the links generally point to HTTP rather than HTTPS (understandable, since that's the default for most web users). The site still loads with no problem, but my concern is that since a redirect doesn't transfer all the link juice across, we're leaking some perfectly good link credit. From the standpoint of backlinks, are we harming ourselves by making the whole site secure by default? The site presently isn't very big, but I'm looking at adding hundreds of new pages to the site, so if we're going to make the change, now is the time to do so. Let me know what you think!
Technical SEO | ufmedia