Google indexing HTTPS sites by default now - where's the Moz blog about it?!
-
Hello and good morning / happy Friday!
Last night someone sent me an article from, of all places, VentureBeat, titled "Google Search starts indexing and letting users stream Android apps without matching web content." As I read this I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL.
I then quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so.
Google Webmaster Central Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites.
I wanted to hear about the eight key rules that Google will factor in when ranking/indexing HTTPS pages from now on, and see what you all think about this.
Google will now begin to index HTTPS equivalents of HTTP web pages, even when the HTTPS versions aren't linked from any page. However, Google will only index an HTTPS URL if it meets these conditions:
- It doesn’t contain insecure dependencies.
- It isn’t blocked from crawling by robots.txt.
- It doesn’t redirect users to or through an insecure HTTP page.
- It doesn’t have a rel="canonical" link to the HTTP page.
- It doesn’t contain a noindex robots meta tag.
- It doesn’t have on-host outlinks to HTTP URLs.
- The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
- The server has a valid TLS certificate.
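Just to make the list concrete, here's a rough sketch of how a few of these conditions (the noindex meta tag, the canonical pointing at HTTP, and on-host HTTP outlinks) could be checked against a page's HTML. The function names and structure are my own invention, not anything Google publishes; checking the cert, robots.txt, and redirects would take extra plumbing that's left out here.

```python
# Hypothetical sketch: check a page's HTML against three of the
# conditions above. This is illustrative only - names are made up.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ConditionChecker(HTMLParser):
    def __init__(self, host):
        super().__init__()
        self.host = host        # the site's own hostname
        self.problems = []      # human-readable list of violations

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Condition: no noindex robots meta tag
        if tag == "meta" and (a.get("name") or "").lower() == "robots" \
                and "noindex" in (a.get("content") or "").lower():
            self.problems.append("noindex robots meta tag")
        # Condition: no rel="canonical" link to the HTTP page
        if tag == "link" and (a.get("rel") or "").lower() == "canonical" \
                and (a.get("href") or "").startswith("http://"):
            self.problems.append("canonical points to HTTP: " + a["href"])
        # Condition: no on-host outlinks to HTTP URLs
        if tag == "a":
            href = a.get("href") or ""
            if href.startswith("http://") and urlparse(href).hostname == self.host:
                self.problems.append("on-host HTTP outlink: " + href)

def check_page(html, host):
    """Return a list of condition violations found in the HTML."""
    checker = ConditionChecker(host)
    checker.feed(html)
    return checker.problems
```

An empty list back from `check_page` would mean the page passes those three checks, though of course the remaining conditions (cert, robots.txt, redirects, sitemap) still apply.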
One rule that confuses me a bit is:
- **It doesn't redirect users to or through an insecure HTTP page.**
Does this mean that if you just moved over to HTTPS from HTTP, your site won't pick up the HTTPS boost, since most sites have HTTP-to-HTTPS redirects in place?
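As I read it (and I could be wrong), the rule is about the HTTPS URL itself redirecting to or through HTTP, not about old HTTP URLs redirecting onto the HTTPS site. A toy way to express that check, assuming you've already fetched the ordered hops of a redirect chain:

```python
# Hypothetical sketch: flag the hops in a redirect chain that are
# served over plain HTTP. Fetching the chain itself is left out.
from urllib.parse import urlparse

def insecure_hops(chain):
    """Return the URLs in an ordered redirect chain that use plain HTTP."""
    return [url for url in chain if urlparse(url).scheme == "http"]
```

On that reading, `["http://example.com/page", "https://example.com/page"]` (the normal migration redirect) has its only insecure hop at the old HTTP URL, while an HTTPS URL bouncing through an HTTP tracker mid-chain would be the problem case the rule targets. Again, that's my interpretation, not a confirmed statement of how Google evaluates it.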
Thank you!
-
Can you please give a concrete example of a keyword for which you do not rank well? Please also specify what, in your opinion, needs to appear in the search results, and the target page on the nextgenapk blog.
-
Thanks for your response, Peter! As I said, I could be totally wrong - glad I asked this question
Cheers!
-
_"Or you can leave but change their links to pass some URL shortener - bit.ly or t.co until they comes with HTTPS version."_
Looking at it from a technical standpoint, these shorteners are also not HTTPS when crawled. Would they not have the same effect as other non-HTTPS links?
Sorry, I could be totally wrong about this and this question may not make sense at all.
-
Touché, good sir, these are certainly some great ways to go about this. Especially number 3.
Thanks!
Wonder how long we have until HTTP/2 implementation...
-
Or you can leave the links in place but route them through a URL shortener - bit.ly or t.co - until those sites come up with an HTTPS version.
Or you can make a separate "partners" page where you link only to HTTP external sites.
Or you can make an internal redirector page to the HTTP site, like HTTPS -> HTTPS (internal redirector and dummy page) -> HTTP. In this case the redirector won't be indexed, which is why it's a dummy page.
And these are just three ideas I came up with in a minute. My favorite is probably #3, but that's IMHO.
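For idea #3, the dummy redirector could be as simple as a generated page that carries a noindex tag and a meta refresh out to the HTTP site. A minimal sketch, with the function name and markup being my own illustration rather than a prescribed pattern:

```python
# Hypothetical sketch of the "dummy redirector page" idea: a page
# kept out of the index (noindex) that forwards visitors to the
# external HTTP site via a meta refresh.
def redirector_page(target_url, delay_seconds=0):
    """Build the HTML for a noindexed dummy page forwarding to target_url."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex, nofollow">
  <meta http-equiv="refresh" content="{delay_seconds}; url={target_url}">
  <title>Redirecting...</title>
</head>
<body>
  <p>Taking you to <a href="{target_url}">{target_url}</a>...</p>
</body>
</html>"""
```

A server-side 302 on the dummy URL would work just as well; the key point is that the intermediate page is explicitly kept out of the index.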
-
So if my manufacturers don't have HTTPS sites, should I remove the links to them, since they're going to hinder indexing?
Thanks for the response about HTTP redirecting to HTTPS.
-
Some sites come with redirectors or "beacons" for detecting user presence. For example, I'm on site X, page A, and I click a link to go to page B. But thanks to the marketing department, that click passes through an HTTP redirector or a plain HTTP URL (which then 301-redirects to HTTPS). In that case, page B may not get indexed.
This means that once you set up a sitewide 301 redirect to the encrypted connection, you must take a few more steps:
- Check that all resources load over the encrypted channel - images, CSS, JS, everything.
- Check that the canonical is set to HTTPS.
- Check that links between pages are also HTTPS.
- Check that any third-party tools use an encrypted connection - analytics software, "tracking pixels," heat maps, ads.
- Check whether outgoing links from your site can point to sites with encryption - Wikipedia, Moz, Google. Since everything there is already encrypted, you'll also skip the frustrating HTTPS -> HTTP -> HTTPS jump.
Only then can your site be indexed in HTTPS. It's a tricky procedure with many traps.
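The first item on that checklist - making sure every resource loads over the encrypted channel - is easy to scan for mechanically. A small sketch (my own illustrative code, not a standard tool) that walks a page's HTML and reports subresources still pulled over plain HTTP:

```python
# Hypothetical sketch: find subresources (images, JS, CSS, iframes)
# still loaded over plain HTTP, i.e. mixed content on an HTTPS page.
from html.parser import HTMLParser

# Tag -> attribute that carries the resource URL
RESOURCE_ATTRS = {"img": "src", "script": "src", "iframe": "src", "link": "href"}

class MixedContentScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []  # list of (tag, url) pairs

    def handle_starttag(self, tag, attrs):
        attr = RESOURCE_ATTRS.get(tag)
        if attr is None:
            return
        url = dict(attrs).get(attr) or ""
        if url.startswith("http://"):
            self.insecure.append((tag, url))

def find_mixed_content(html):
    """Return (tag, url) pairs for resources referenced over plain HTTP."""
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure
```

Running something like this across your templates after the migration catches the stragglers before Google (or the browser's mixed-content warning) does.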