HTTP and HTTPS protocols both being indexed for e-commerce website
-
Hi team,
Our new e-commerce website has launched and I've noticed both http and https protocols are being indexed.
Our old website was http with only the necessary pages running https (cart, checkout, etc.). No https pages were indexed, and you couldn't access an https page if you manually typed it into the browser.
We outrank our competition by a mile, so I'm treading carefully here and don't want to undo the progress we made on the old site, so I have a few questions:
1. How exactly do we remove one protocol from the index? We are running on Drupal. We tried a hard redirect from https to http and excluded the relevant pages (cart, login, etc.) from the redirect, but found that you could still access https pages: if you were in the cart (https) and then pressed the browser's back button, for example, you could then browse the entire site on https.
2. Is the safer option to emulate what we had in place on the old website, e.g. http with only the necessary pages being https, rather than making the switch to sitewide https?
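For context, the hard redirect we tried was roughly along these lines (a sketch from memory, not our dev's exact rules; Apache assumed, and the excluded paths are examples):

```apache
RewriteEngine On
# Send https traffic back to http, except for cart/checkout/login pages
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|checkout|user/login) [NC]
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```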
I've been struggling with this one, so any help would be much appreciated.
Jake S
-
Just checked my GA data and you're right. Referral data from mountainjade.co.nz is there. Thanks for the heads up.
I've decided to make the switch to https, so will be organising that with dev in the coming few weeks. I'll keep you posted!
Cheers for the help again Logan,
I owe ya.
-
Great!
I've decided to make the full switch to https now, rather than wait to do it.
I will report back and let you know how it all goes!
Thanks for your help Laura.
-
I don't know why this didn't cross my mind until now, but having both versions can also mess up your Google Analytics data. Going from one to the other (can't remember which direction) creates a new session. You've probably got a lot of self-referring traffic showing up in your reports.
-
Hey Bas,
My developers share your sentiment!
Both versions of the website can be accessed by both customers and bots, and because we use relative URLs, a visitor can switch between http and https in a single session. Here's one example:
1. Land on the homepage from a google search (http homepage is indexed).
2. Browse site on http. Add something to cart. Go to cart.
3. Cart switches to https. Navigate out of cart back into website.
4. Now the URLs are all https, because the links on our site are relative and don't specify a protocol. E.g. the customer is in the cart (https) and then wants to check the contact us page; its link, when clicked, is [Contact](/contact us), so it inherits the https protocol because no protocol is specified in that link.
Hmmm, it definitely could be affecting UX and conversion.
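That "sticky protocol" behaviour of relative links can be reproduced with Python's `urljoin` (example.com is a placeholder domain):

```python
from urllib.parse import urljoin

# A relative link like /contact has no scheme or host of its own, so it
# inherits both from the current page. Once the cart switches the customer
# to https, every relative link resolved from there stays on https.
current_page = "https://www.example.com/cart"   # customer is in the https cart
relative_link = "/contact"                      # a typical relative site link

resolved = urljoin(current_page, relative_link)
print(resolved)  # https://www.example.com/contact -- still https
```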
-
Ideally, you'll migrate the entire site to https, and Cyrus' guide is a good one. Google has some helpful info for an http to https migration at https://support.google.com/webmasters/answer/6073543?hl=en.
The canonical tag solution is for the situation where you can't or don't want to go ahead and switch the whole site over to https right away. Either way, make sure Google knows, either through 301-redirects or canonical tags, that the http and https versions are the same page.
-
Hi Laura,
Wow, when I said we have self-referencing canonicals in place (through Drupal Yoast), I hadn't even thought that it could be applying a canonical to the https version of the site as well.
I just crawled both http and https, and you're right: the following is happening.
http://example.com is canonicalized to http://example.com
https://example.com is canonicalized to https://example.com
But I'm a little confused. In my first post I was looking for help because Google was indexing both http and https pages. Are you saying that it's because of these canonicals that Google is indexing both? Would it index both even if I didn't have the canonicals in place but still had SSL?
Just to confirm: canonicalizing the http URLs to the https URLs will tell Google to fold the http URLs into the https URLs and only index the https version of the site? Would I need to follow Cyrus's https migration guide when doing this, or is this not really a 'migration' to https, since we're not forcing the customer to browse in https?
Bear with me!
-
I agree with the others. I think you should pick a horse and ride it. Indecision is only causing more confusion on Google's part and is going to hurt you in the long run. Google says they prefer HTTPS and I've seen evidence of that. You're already paying for an SSL so you might as well use it to the max.
As Laura said, if you've got self-referring canonical tags on both secure and non-secure URLs, you're setting yourself up for some pretty big issues.
-
Hi Jacob,
I understand the issue. I think that this way you're not making a decision where you really should: either you use SSL or you don't. Continuing with both is a terrible situation: nobody really knows what they are supposed to see. For instance: is it possible that someone starts on the homepage (non-SSL), goes to a product page (SSL), and then to the shopping cart, which is again non-SSL? If that is the case, you should really check your conversion rate, because that in itself might be very damaging as well.
Yours,
Bas
-
When you say you currently have self-referencing canonicals, is the following happening?
The page http://example.com is canonicalized to http://example.com.
The page https://example.com is canonicalized to https://example.com.
If so, this is the bigger problem because Google sees these as 2 different URLs and may index both of them. Furthermore, you could be splitting backlinks between 2 URLs unnecessarily. This duplicate issue may be part of the reason you saw organic traffic drop when you launched your new site.
If the HTTPS URLs are already being indexed by Google, go ahead and canonicalize the http URLs to the https URLs. In other words, http://example.com will canonicalize to https://example.com.
By setting up the canonical this way, Google will fold the two URLs together and correctly treat them as the same page.
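Concretely, the head of each page would carry tags along these lines (example.com stands in for your domain):

```html
<!-- On http://example.com/ : point the canonical at the https version -->
<link rel="canonical" href="https://example.com/">

<!-- On https://example.com/ : keep a self-referencing canonical -->
<link rel="canonical" href="https://example.com/">
```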
-
Good morning Laura,
Thanks for the advice.
I've replied below to Logan giving a little context. If you could take a look and let me know your thoughts it would be a huge help.
-
Hi again Logan,
I've tossed up whether or not to make the full switch to https for a while now. I'll give you a little background so you understand my position:
When our new website launched, our organic search traffic took a dip of around 15%. It has taken around two months for it to (almost) recover. We changed site structure out of necessity but followed best practice to ensure we didn't undo a lot of the work we had done with the old website. With the 15% organic rankings dip we saw a corresponding dip in revenue, so what I don't want to do is muddy the waters any more than they already are by adding more moving parts to the mix (migration / redesign / http to https). And we cannot risk another dip in revenue so close to the first, which may come with a full https migration (do you think?).
This is why I'm leaning toward replicating what we had in place on the old website and only forcing https on the necessary pages.
Now that you understand my position, would you still recommend the switch to https? I would love to know your thoughts.
The catch with all of this is that I'm not sure exactly how the http/https split was implemented on the old website. At that point in time I had no need to know.
We currently have self-referencing canonicals, which as you know we need to maintain, particularly on product pages which use URL parameters. We are also using relative links across the entire website.
Therefore, what would be the best solution here? Down the rabbit hole we go...
Thanks for your time,
-
Hi Jacob,
Cyrus Shepard put together a great guide on HTTPS migrations. Since you've already got an SSL, you may as well apply it to the whole site and set your preferred domain as HTTPS (as Laura and Bas mentioned). In the guide, he details the best ways to ensure search engines index the version you want via 301 redirect rules, canonical tags, and XML sitemaps. Don't forget to set up Search Console properties for HTTPS - www and non-www versions and set your preferred domain there as well.
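The sitewide redirect portion of that setup might look something like this (a minimal sketch, assuming Apache; adapt to your stack and test before deploying):

```apache
RewriteEngine On
# Force every http request to its https equivalent with a 301
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```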
Run this query in Google to monitor what they've got in their index as the canonical domain: info:mountainjade.co.nz
-
Agree with Laura: better to let the https version be indexed. Nice links for this topic, by the way.
Bas
-
In your case, the best thing to do is set up canonical tags to let Google know which version of the URL should be indexed. That way, it doesn't matter if Google can access the https page, and you won't have the duplicate content problem that you have now.
I can't advise you on the best way to set this up with Drupal, but you'll need to be wary of any type of automatic canonical tags. You may end up with an "http" canonical link on the http page and an "https" canonical link on the https page. That doesn't solve the problem at all.
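One way to sanity-check for that automatic-canonical trap is to pull the canonical tag from each version of a page and compare schemes. A quick Python sketch (the HTML samples are hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def canonical_scheme(html):
    """Return the scheme (http/https) of the page's canonical URL, or None."""
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical:
        return urlparse(finder.canonical).scheme
    return None

# If the http page canonicalizes to http, the CMS is self-referencing
# on both protocols -- exactly the situation to avoid.
http_page = '<html><head><link rel="canonical" href="http://example.com/"></head></html>'
print(canonical_scheme(http_page))  # http
```

If both the http and https fetches of the same page return their own scheme, the automatic canonicals are working against you.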
If you are not already familiar with canonical tags, you can learn more at the links below.
- https://support.google.com/webmasters/answer/139066?hl=en
- https://moz.com/learn/seo/canonicalization
- https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html
By the way, I would set it up so that Google indexes the https version of your pages rather than the http version.