GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.

- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing the https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still has the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.

We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through HTTP/1.1 rather than HTTP/2.

A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page.

Any thoughts, further tests, ideas, direction or anything will be much appreciated!
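A quick first test is to request the homepage with a client that asks for HTTP/2 and see what the server actually negotiates. A minimal sketch in Python, assuming the `httpx` library is installed with its HTTP/2 extra (`pip install "httpx[http2]"`) and using a placeholder hostname:

```python
# Minimal sketch: report which protocol version the server negotiates
# for the homepage. www.example.com is a placeholder hostname.
import httpx

with httpx.Client(http2=True) as client:
    response = client.get("https://www.example.com/")
    # "HTTP/2" if the server agreed to h2 in the TLS handshake,
    # otherwise "HTTP/1.1".
    print(response.http_version, response.status_code)
```

Worth knowing: Google decides per site whether GoogleBot crawls over HTTP/2, and there is no server-side way to force it (sites can only opt out by returning 421). So a clean h2 result here rules the server out as the cause without guaranteeing h2 crawling.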
-
-
Quoting here to ask again: why is this happening with our pages too? Is Google going crazy, or what?
@James-Avery said in GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2:
First off, it's great that your entire website made the transition to HTTPS and HTTP/2 three years ago. That's definitely a step in the right direction for performance and security.
Since your hosting provider has confirmed that the server is configured correctly for HTTP/2 and you've got the 301 redirects set up properly, it's puzzling why GoogleBot is still sticking to HTTP/1.1 for the homepage. One thing you might want to double-check is whether there are any specific directives in your server configuration that could be affecting how GoogleBot accesses your site. Sometimes, even seemingly minor configurations can have unintended consequences.
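One configuration-independent check is to look at what the server advertises during the TLS handshake, since HTTP/2 is negotiated there via ALPN. A minimal sketch using only the Python standard library, with a placeholder hostname:

```python
# Minimal sketch: check whether the server offers h2 via TLS ALPN.
# Standard library only; www.example.com is a placeholder hostname.
import socket
import ssl

HOST = "www.example.com"

context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])

with socket.create_connection((HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        # "h2" means HTTP/2 was selected; "http/1.1" or None means it wasn't.
        print(tls.selected_alpn_protocol())
```

If this prints "h2", the server side is doing its part and the protocol choice is down to the client.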
Regarding the non-secure version of your website still showing up in the Discovery section of Google Search Console (GSC), despite the homepage being correctly indexed with the HTTPS version, it could be a matter of Google's index taking some time to catch up. However, it's worth investigating further to ensure there aren't any lingering issues causing this discrepancy.
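Since Discovery can keep showing a stale referrer long after redirects are fixed, it may also be worth re-verifying the redirect hops directly. A sketch with placeholder hostnames, again assuming `httpx`, that checks each legacy variant returns a single 301 to the canonical https://www. homepage:

```python
# Minimal sketch: confirm each legacy URL variant 301s straight to the
# canonical https://www. homepage. All hostnames are placeholders.
import httpx

CANONICAL = "https://www.example.com/"
VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in VARIANTS:
    r = httpx.get(url, follow_redirects=False)
    location = r.headers.get("location")
    ok = r.status_code == 301 and location == CANONICAL
    print(f"{url} -> {r.status_code} {location} {'OK' if ok else 'CHECK'}")
```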
As for the home page not ranking as well in SERPs compared to other pages, despite having better content and speed, this could be due to a variety of factors. It's possible that Google's algorithms are prioritizing other pages for certain keywords or that there are specific technical issues with the homepage that are affecting its visibility.
In terms of next steps, I'd recommend continuing to monitor the situation closely and perhaps reaching out to Google's support team for further assistance. They may be able to provide additional insights or suggestions for resolving these issues.
Overall, it sounds like you've done a thorough job of troubleshooting so far, but sometimes these technical SEO mysteries require a bit of persistence to unravel. Keep at it, and hopefully, you'll be able to get to the bottom of these issues soon!
-
-
@john1408 said in GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2:
It's baffling that GoogleBot persists with HTTP/1.1 for the homepage despite proper setup. Consider exploring Google Search Console further for indexing insights, and reach out to Google Support for assistance in resolving this unusual behavior.
-
-
It seems like you've taken several steps to ensure the correct protocol (HTTP/2) for your website, and it's puzzling that GoogleBot still accesses the home page via HTTP/1.1. A few additional suggestions:
Crawl Rate Settings: Check your Google Search Console (GSC) for crawl rate settings. Google might be intentionally crawling your site slowly.
Server Logs: Reanalyze server logs to confirm that GoogleBot is indeed accessing the home page via HTTP/1.1; this could help identify patterns or anomalies (see the log-tally sketch after this list).
Mobile Usability: Ensure your home page is mobile-friendly, as Google now uses mobile-first indexing.
URL Inspection Tool: Use GSC's URL Inspection tool (the successor to the old Fetch and Render) to see how Google renders your home page. It might provide insights into how Google sees your page.
Structured Data and Markup: Ensure structured data and markup on your home page are correct and up-to-date.
Manual Submission: Consider manually requesting indexing for your home page through GSC.
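For the server-log reanalysis suggested above, a rough tally of protocol versions per requested path can show whether the HTTP/1.1 pattern really is specific to the homepage. A minimal sketch; the log path and combined-format regex are assumptions to adjust for your server, and a thorough audit would also verify that the client IPs really belong to Google:

```python
# Minimal sketch: tally protocol versions GoogleBot used per path.
# The log path and line format are assumptions; adjust to your server.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder path
REQUEST = re.compile(r'"(?:GET|HEAD) (\S+) (HTTP/[\d.]+)"')

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = REQUEST.search(line)
        if match:
            counts[match.groups()] += 1

# e.g. ('/', 'HTTP/1.1') with a high count confirms the homepage pattern.
for (path, version), n in counts.most_common(10):
    print(path, version, n)
```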
Regarding the new pages performing well compared to the home page, it might be worth revisiting your on-page SEO elements and analyzing the competition for relevant keywords.