Landing pages showing up as HTTPS when we haven't made the switch
-
Hi Moz Community,
Recently our tech team has been taking steps to switch our site from http to https. They have reviewed all of the SEO redirect requirements and we're confident about the switch, but we're not planning to roll anything out until a month from now.
However, I recently noticed a few https versions of our landing pages showing up in search. We haven't pushed any changes to production yet, so this shouldn't be happening. Only a select few of the landing pages are https, and I can't see a pattern. This is messing up our GA and Search Console tracking, since we haven't fully set up https tracking yet; we weren't expecting any of these pages to change.
HTTPS has always been supported on our site but never indexed, so it has never shown up in the search results. Looking at the current site, it appears the landing page canonicals are already pointing to their https versions, which may be the problem.
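(A quick way to spot-check which landing pages carry an https canonical is a small script along these lines; this is just a sketch using only the Python standard library, and the sample markup and URLs are placeholders, not our actual pages.)

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects href values from <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical" and "href" in attrs:
                self.canonicals.append(attrs["href"])

def canonical_scheme(html):
    """Return the scheme ('http' or 'https') of the first canonical tag, or None."""
    parser = CanonicalParser()
    parser.feed(html)
    if not parser.canonicals:
        return None
    return parser.canonicals[0].split("://", 1)[0]

# Hypothetical landing page markup with an https canonical
page = '<html><head><link rel="canonical" href="https://example.com/landing"></head></html>'
print(canonical_scheme(page))  # https
```

Running this over the fetched HTML of each landing page makes it easy to see which ones are canonicalized to https without clicking through them one by one.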
Anyone have any other ideas?
-
What I would do is the following: change the rel=canonical tags back, add the https version of the website as a property in Search Console, request removal of the https URLs there, and then fetch and request reindexing of the http versions (also from Search Console). So basically, help Google understand the mistake and go back to the http version. Also, check your sitemaps and be sure that you are not including https links there. Hope this helps.
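(For the sitemap check, something like this works as a minimal sketch using only the Python standard library; the sample sitemap below is made up for illustration.)

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, in ElementTree's {uri}tag notation
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def https_urls_in_sitemap(xml_text):
    """Return every <loc> URL in the sitemap that uses the https scheme."""
    root = ET.fromstring(xml_text)
    locs = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in locs if u.startswith("https://")]

# Hypothetical sitemap with one stray https entry
sitemap = """
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/landing-a</loc></url>
  <url><loc>https://example.com/landing-b</loc></url>
</urlset>
"""
print(https_urls_in_sitemap(sitemap))  # ['https://example.com/landing-b']
```

Any URLs it prints are candidates to fix before resubmitting the sitemap.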
-
Hi Christian,
Thanks for the reply. HTTPS rel=canonical tags were added to the live pages; as I expected, this is why some are showing up in the search results. It's a problem, though, for GA and Search Console tracking, since we haven't made the switch server-side yet and http pages don't currently redirect to their https versions. So we're seeing no sessions for our http versions.
If I change the rel=canonical back to http on the live site, I'm guessing the non-secure pages will show up again after being crawled?
Thanks!
-
Hi! I don't quite understand the question. Did you add an https rel=canonical to live pages and are now wondering why they are indexed? If so, this is normal behavior, since your website already supports https and you have pointed to it. The reason only a few landing pages show up as https for now might be down to how and when the crawler got to them. I hope I didn't totally misunderstand the question.
Related Questions
-
A new client has image urls showing above their page rankings for the same key phrase.
New client website https://yorkshirefoodguide.co.uk/ has, for some key phrase searches, the URL for an image showing above or alongside the URL for the landing page. I'd be happy for it to show in the image pack, but I want the page URL to rank in the main SERP. The site is in WordPress and I'm sure this is just a setting I need to manage. Can you help please?
Technical SEO | | Marketing_Optimist0 -
What's the best way to test Angular JS heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs being rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident I understand how to tell when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that generally reflects what a crawler should be able to crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this; is this concern valid? Thanks!
Technical SEO | | znotes0 -
Email and landing page duplicate content issue?
Hi Mozers, my question is: if a web-based email goes out to subscribers, and clicking a link in it lands them on a WordPress page with very similar content, will Google penalize us for duplicate content? If so, is the best workaround to make the email version noindex, nofollow? Thanks!
Technical SEO | | CalamityJane770 -
Switching to HTTPS
Hey Moz Community! I am about to switch my website from http to https. I am wondering if I need to create a 301 redirect for every single page on my site, from the http address to the https URL, or if setting up one master rule that redirects all traffic to the https version is enough. Obviously I am concerned about losing rankings by switching. Here is the code for the https redirect of everything:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{SERVER_NAME}/$1 [R,L]

Technical SEO | | SeanConroy1 -
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran fetch and render through Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: 'For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:

User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /

Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | | KempRugeLawGroup1 -
Will it make any difference to SEO on an ecommerce site if they use their SSL certificate (https) across every page
I know that e-commerce sites usually have SSL certificates on their payment pages. A site I have come across uses the https prefix on every page of the site. I'm just wondering if this will make any difference to the site in the eyes of search engines, and whether it could affect the site's rankings?
Technical SEO | | Sayers1 -
Google Penguin - Target Landing Page
Hello, One of our sites was hit by the first Penguin update back in April, and ever since then we have been removing links and submitting reconsideration requests... It only seems to have affected our home page, as some of our internal landing pages are still ranking OK in the SERPs (#1/#2). I'm just wondering: if we created a landing page for this keyword and drove high-quality, relevant links to it, could we get it to rank higher than our homepage, even though our homepage is on the 5th page? Hope the above makes sense. Has anybody had any joy with this?
Technical SEO | | ScottBaxterWW0 -
We changed the URL structure 10 weeks ago and Google hasn't indexed it yet...
We recently modified the whole URL structure on our website, changing a huge number of URLs to nice, human-readable ones, which left us with a large number of 404 pages. We did this in the middle of March, about 10 weeks ago... We had around 5,000 404 pages in the beginning, but the number is decreasing slowly (we have around 3,000 now). On some parts of the website we have also set up 301 redirects from the old URLs to the new ones, to avoid showing a 404 page and to make the "indexing transmission", but it doesn't seem to have made any difference. We've lost a significant amount of traffic because of the URL changes, as Google has removed the old URLs but hasn't indexed our new URLs yet. Is there anything else we can do to get our website indexed with the new URL structure quicker? It might also be useful to know that we are a PageRank 4 site with over 30,000 unique users a month, so I am sure Google visits quite often; pages we have created since then, which only have the new URL structure, are indexed within hours, and sometimes they appear in search the next day!
Technical SEO | | jack860