HTTPS & HTTP
-
I have my website (HTTP://thespacecollective.com) set in Google Webmaster Tools as the primary domain, as opposed to HTTPS. But should all of my on-page links be HTTP?
For instance, clicking the Home button on my home page takes the user to HTTP, but typing the domain name into the address bar takes you to HTTPS.
Could this be causing me problems for SEO?
-
Thank you!
-
Yes, you should have both active in Search Console, but set HTTPS as the preferred version.
-
You will continue to have both the HTTP and HTTPS variants active in Google Search Console (you should also add the non-www variants and set www as your preferred version).
You do not set anything up within GSC to direct HTTP to HTTPS (to tell Google that you are changing protocols); this is all done via redirects, as Logan suggests. Here's a great page which should help clarify this for you:
http://webmasters.stackexchange.com/questions/68435/moving-from-http-to-https-google-search-console
-
Thanks for the additional info, but I think you missed my question. Please see the attached image.
I have HTTP and HTTPS set up on Google Search Console. Which one should I be using, or should both be active?
-
Yes, as the bots hit the URLs in your sitemap, it forces them to step through the redirect, which is what you want. They won't notice the new location if you don't point it out to them, and this is the most efficient way to do so.
*To be clear, since this gets confusing: the URL of your XML sitemap itself should be HTTPS://thespacecollective.com/sitemap.xml, but the URLs listed in it should be HTTP.
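A minimal sketch of that arrangement (purely illustrative; the /example-page path is hypothetical) is a sitemap served from the HTTPS location whose <loc> entries still point at HTTP:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- file served from https://thespacecollective.com/sitemap.xml -->
      <url><loc>http://thespacecollective.com/</loc></url>
      <url><loc>http://thespacecollective.com/example-page</loc></url>
    </urlset>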
Also, add this line to your robots.txt file, as the first line or the last line (it doesn't really matter):
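A sketch of that line, assuming it's the Sitemap directive pointing at the HTTPS location above (the directive can sit anywhere in robots.txt):

    # robots.txt
    Sitemap: https://thespacecollective.com/sitemap.xml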
-
Thanks Logan. Now I have two sites set up in Google Search Console, HTTP and HTTPS. The HTTP version has the sitemap and pretty much everything set up; should I just keep using this even though the site will now be HTTPS?
-
When you set up the 301 redirect rule that sends HTTP requests to HTTPS, Google will notice that. Leave your XML sitemap the way it is (with HTTP URL references) for 30 days. This will give them sufficient time to crawl your XML sitemap and learn your new protocol as they hit the redirects. Once most of your indexed pages have switched to HTTPS, you can update your XML to include the secure URLs.
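As a reference point, a minimal version of that HTTP-to-HTTPS 301 rule on an Apache server (assuming mod_rewrite is available; the thread doesn't say what the site runs on) looks like this:

    # .htaccess - send every HTTP request to its HTTPS equivalent with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Other servers have equivalents (e.g. a return 301 in the port-80 server block on nginx); the important part is a single permanent redirect to the matching HTTPS URL.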
-
Thank you for the links; I have read through each and have decided to change to HTTPS as you advise. I've done everything with the exception of informing Google that the new site is HTTPS as opposed to HTTP. How do I make them aware?
I have set up HTTP and HTTPS in Webmaster Tools, but how do I tell Google which one is relevant in order to stop any duplicate content issues?
-
From what I understand, you've already decided to split your traffic between HTTP and HTTPS. If this is correct, I would urge you to reconsider and redirect all traffic toward the HTTPS versions, as there are more issues to consider than just duplicate content, particularly as you are an e-commerce store. The latest (and future) versions of Chrome and Firefox will more clearly highlight unsecured connections. This is from Google's security blog (https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html?m=1):
"In following releases, we will continue to extend HTTP warnings, for example, by labelling HTTP pages as “not secure” in Incognito mode, where users may have higher expectations of privacy. Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS."
Chrome is the world's most popular browser, used by over 50% of all internet users. If your site displays a red triangle with the words 'Not Secure' next to it on ANY page, it is going to turn visitors away. If over half of your visitors are receiving such a message, the consequences will not be good.
Google are pushing users toward HTTPS (https://moz.com/blog/https-tops-30-how-google-is-winning-the-long-war), so I would suggest that it's a misstep to swim against the tide.
There are also other minor benefits to serving all of your pages via HTTPS: it's a minor ranking signal, and it brings better support for browser compression, among other things.
Here's another article that covers the recent changes.
https://www.searchenginejournal.com/google-is-requiring-https-for-secure-data-in-chrome/183756/
However you proceed, I hope this goes smoothly for you.
Good luck.
-
Perfect, thank you. I'm doing this as we speak!
-
Rolling back to HTTP for non-checkout pages is an option as well. The main point I was trying to make was to not have both versions of your URLs accessible/indexable.
-
Thanks for this Logan. Surely it makes more sense for me to simply change my website to HTTP and just keep Cart/Checkout, etc. as HTTPS? I see changing to HTTPS as a big risk and a lot of unnecessary work for very little benefit.
-
Hi,
Both the HTTP and HTTPS versions of your site will render, and that's a problem. Since you've got an SSL certificate and it's been applied to the home page, you should make your entire site secure. Once you've done that, you'll want to apply a redirect rule that sends all HTTP requests to the HTTPS version. Because you're not currently doing that, you're running the risk of duplicate content issues. Once you've done that, yes, you should set the primary domain in Google Search Console (WMT) as HTTPS. There are a few other steps you'll want to take as well - Cyrus Shepard wrote a great post detailing all the necessary steps for a secure migration, and I highly recommend reading it.
Additionally, when people on your site are bouncing back and forth between HTTP and HTTPS, it's destroying your data integrity in Google Analytics. Going from an HTTP page to an HTTPS page breaks the session and starts a new one that will be attributed to direct traffic. You can see how this would quickly become a nightmare for accurate analysis and measurement. If you follow the steps in Cyrus' post, your GA data should return to normal because users won't be going back and forth from secure to non-secure.