HTTPS & HTTP
-
I have my website (HTTP://thespacecollective.com) set in Google Webmaster Tools as the primary domain, as opposed to HTTPS. But should all of my on-page links be HTTP?
For instance, clicking the Home button on my home page takes the user to the HTTP version, but typing the domain name into the address bar takes you to HTTPS.
Could this be causing me problems for SEO?
-
Thank you!
-
Yes, you should have both active in Search Console, but set the HTTPS property as the preferred version.
-
You will continue to have both the HTTP and HTTPS variants active in Google Search Console (you should also add the non-www variants and set www as your preferred version).
You do not set anything up within GSC to direct HTTP to HTTPS (i.e. to tell Google that you are changing protocols); this is all done via redirects, as Logan suggests. Here's a great page that should help clarify this for you:
http://webmasters.stackexchange.com/questions/68435/moving-from-http-to-https-google-search-console
-
Thanks for the additional info, but I think you missed my question. Please see the attached image.
I have HTTP and HTTPS set up on Google Search Console. Which one should I be using, or should both be active?
-
Yeah, as the bots hit the URLs in your sitemap, it forces them to step through the redirect, which is what you want. They won't notice the new location if you don't point it out to them, and this is the most efficient way to do so.
*To be clear, since this gets confusing: the URL of your XML sitemap's location should be HTTPS://thespacecollective.com/sitemap.xml, but the URLs listed in it should be HTTP.
Also, add this line to your robots.txt file, as either the first or last line (it doesn't really matter):
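That line is the standard Sitemap directive pointing at the secure location mentioned above; presumably something along the lines of:

    Sitemap: https://thespacecollective.com/sitemap.xml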
-
Thanks, Logan. Now I have two sites set up in Google Search Console, HTTP and HTTPS. The HTTP version has the sitemap and pretty much everything set up; should I just keep using this even though the site will now be HTTPS?
-
When you set up the 301 redirect rule that sends HTTP requests to HTTPS, Google will notice that. Leave your XML sitemap the way it is (with HTTP URL references) for 30 days. This will give them sufficient time to crawl your XML sitemap and learn your new protocol as they hit the redirects. Once most of your indexed pages have switched to HTTPS, you can update your XML to include the secure URLs.
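For reference, a minimal sketch of what that transitional XML sitemap would contain during those 30 days, with the non-secure URLs still listed (the entries shown here are only illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- URLs stay on HTTP for now; each one 301s to its HTTPS equivalent when crawled -->
      <url>
        <loc>http://thespacecollective.com/</loc>
      </url>
      <url>
        <loc>http://thespacecollective.com/some-page</loc>
      </url>
    </urlset>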
-
Thank you for the links. I have read through each and have decided to change to HTTPS, as you advise. I've done everything with the exception of informing Google that the new site is HTTPS as opposed to HTTP. How do I make them aware?
I have set up http and https in Webmaster Tools, but how do I tell Google which one is relevant in order to stop any duplicate content issues?
-
From what I understand, you've already decided to split your traffic between HTTP and HTTPS. If this is correct, I would urge you to reconsider and redirect all traffic toward the HTTPS versions, as there are more issues to consider than just duplicate content, particularly as you are an e-commerce store. The latest (and future) versions of Chrome and Firefox will more clearly highlight unsecured connections. This is from Google's security blog (https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html?m=1):
"In following releases, we will continue to extend HTTP warnings, for example, by labelling HTTP pages as “not secure” in Incognito mode, where users may have higher expectations of privacy. Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS."
Chrome is the world's most popular browser, used by over 50% of all internet users. If your site displays a red triangle with the words 'Not Secure' next to it on ANY page, it is going to turn visitors away. If over half of your visitors are receiving such a message, the consequences will not be good.
Google is pushing users toward HTTPS (https://moz.com/blog/https-tops-30-how-google-is-winning-the-long-war), so I would suggest that it's a misstep to swim against the tide.
There are also other minor benefits to serving all of your pages via HTTPS: it's a minor ranking signal, and it brings better support for browser compression, among other things.
Here's another article that covers the recent changes.
https://www.searchenginejournal.com/google-is-requiring-https-for-secure-data-in-chrome/183756/
However you proceed, I hope this goes smoothly for you.
Good luck.
-
Perfect, thank you. I'm doing this as we speak!
-
Rolling back to HTTP for non-checkout pages is an option as well. The main point I was trying to make was to not have both versions of your URLs accessible/indexable.
-
Thanks for this, Logan. Surely it makes more sense for me to simply change my website to HTTP and just keep the Cart/Checkout pages, etc. as HTTPS? I see changing to HTTPS as a big risk and a lot of unnecessary work for very little benefit.
-
Hi,
Both the HTTP and HTTPS versions of your site will render, and that's a problem. Since you've got an SSL certificate and it's been applied to the home page, you should make your entire site secure. Once you've done that, you'll want to apply a redirect rule that sends all HTTP requests to the HTTPS version. Because you're not currently doing that, you're running the risk of duplicate content issues. Once you've done that, yes, you should set the primary domain in Google Search Console (WMT) as HTTPS. There are a few other steps you'll want to take as well - Cyrus Shepard wrote a great post detailing all the necessary steps for a secure migration, and I highly recommend reading it.
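The exact redirect rule depends on your server; on Apache with mod_rewrite, a common sketch of a site-wide HTTP-to-HTTPS redirect looks something like this (test it carefully before deploying, especially alongside any existing redirects):

    # Force every HTTP request to its HTTPS equivalent with a 301 (permanent) redirect
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]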
Additionally, when people on your site are bouncing back and forth between HTTP and HTTPS, it destroys your data integrity in Google Analytics. Going from an HTTP page to an HTTPS page breaks the session and starts a new one that will be attributed to direct traffic. You can see how this would quickly become a nightmare for accurate analysis and measurement. If you follow the steps in Cyrus' post, your GA data should return to normal because users won't be going back and forth from secure to non-secure.