Ranking boost if website loads in under 1 second?
-
Hello,
Can I increase my ranking if my website loads in under 1 second?
Thank you,
-
Thank you for the detailed explanation. I agree with you that it's probably more a penalty for being slow than a bonus for being fast or average.
-
I increased our speed dramatically by getting a server upgrade from our hosting company. I noticed a big increase in traffic and rankings for the pages that previously took more than 10s in total. But be aware that there are lots of different metrics surrounding speed, and what you see inside Google Analytics can sometimes be confusing.
For example, there are metrics for the time it takes the entire page to load, but I have a plugin called 'lazy-load' that loads all the above-the-fold content immediately and is smart about loading bigger items further down the page. There are lots of tools (google 'waterfall speed test' or something similar) that will show you precisely what's slowing the page down.
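To make the lazy-load idea concrete, here is a minimal sketch of how such a plugin typically works: below-the-fold images start with a `data-src` attribute instead of `src`, and only get a real `src` when they scroll into view. This is not the code of any specific plugin; the function names are illustrative.

```javascript
// Minimal sketch of the idea behind a lazy-load plugin. Images below the
// fold are written as <img data-src="big.jpg"> and hydrated on demand.
function hydrateImage(img) {
  // Copy the deferred URL into src so the browser starts the download.
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src;
  }
  return img;
}

// Browser-only wiring; guarded so the sketch is harmless outside a browser.
if (typeof IntersectionObserver !== 'undefined' && typeof document !== 'undefined') {
  const io = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        hydrateImage(entry.target);   // image is visible: load it now
        io.unobserve(entry.target);   // load only once
      }
    });
  });
  document.querySelectorAll('img[data-src]').forEach(function (img) {
    io.observe(img);
  });
}
```

A waterfall speed test will show these deferred images starting their download later in the timeline, after the above-the-fold content has already rendered.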
But under 5 seconds is fine, and we have many number-one national and local rankings for pages loading in 5, 6 and sometimes 8 seconds, depending on the volume of traffic.
It's more of a 'penalty for being slow' than a 'prize for being fast', although a really quick-loading website is impressive from a user perspective.
The next month I migrated our site to HTTPS and saw an immediate bump in rankings across the board, so that's one area where Google is giving out algorithmic love, if you like.
Also be aware of mobile loading times vs. desktop, and make sure you segment the data by country and locality. These data dimensions in Google Analytics will give you a much better picture.
I've found that if someone really likes what they see in the SERP (a strong headline and meta description, nice structured data markup (schema), and additions like sitelinks matching their precise search), they will wait longer for a page to load. And the only way to really test this is with Hotjar or another session-recording tool, so you can see what users are actually doing on the site.
I would also look at bounce rate, or 'adjusted bounce rate'. If you google 'adjusted bounce rate' you'll find you can set a timer at 10 or 20 seconds and stop counting people who read your article, get what they want, and then leave; those visits show up as bounces in Analytics even though they shouldn't really count against you. It's easy to set up, and there's a step-by-step tutorial from a channel called 'measureschool' where I learned how to do it.
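The adjusted-bounce-rate trick boils down to a timer that fires an analytics event after the visitor has been on the page for a while. Here is a hedged sketch of that mechanism; `sendEvent` stands in for whatever your analytics snippet actually exposes (for example `gtag('event', ...)`), and the 20-second threshold is just the example from the post above.

```javascript
// Schedule an "engaged visitor" event after delayMs milliseconds.
// Once this event fires, analytics no longer counts the visit as a bounce.
function scheduleAdjustedBounce(sendEvent, delayMs) {
  // Returns the timer id so the caller could cancel it if needed.
  return setTimeout(function () {
    sendEvent('Adjusted Bounce Rate', delayMs / 1000 + ' seconds on page');
  }, delayMs);
}

// In a real page you would wire it to your analytics library, e.g.:
// scheduleAdjustedBounce(function (category, label) {
//   gtag('event', 'engagement', { event_category: category, event_label: label });
// }, 20000);
```

The exact call you pass as `sendEvent` depends on which analytics snippet your site uses, so treat the commented wiring as an assumption rather than a recipe.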
So I would consider focusing your energies elsewhere for a while to move the dial in the right direction, because sub-1-second is crazy fast. Even for AMP.
-
Good to know that 5 seconds is still good. Thank you very much for the detailed explanation.
-
Well, there are a few website-loading metrics: TTFB (time to first byte), initial rendering, and document complete.
It's great if TTFB is under 1s, including the network connection and DNS resolution.
It's fantastic if initial rendering is close to 1 second too.
It would be great if document complete is around 3-5 seconds. But that won't help you with ranking... well, maybe a tiny bit, but it's insignificant.
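You can read rough values for all three metrics straight from the browser's Navigation Timing API. This is a minimal sketch, assuming `domContentLoadedEventEnd` as a stand-in for "initial rendering" and `loadEventEnd` for "document complete"; run it in the browser console after the page finishes loading.

```javascript
// Compute the three loading metrics (in ms since navigation start)
// from a PerformanceNavigationTiming entry.
function loadingMetrics(nav) {
  return {
    ttfb: nav.responseStart - nav.startTime,                      // time to first byte, incl. DNS + connect
    initialRender: nav.domContentLoadedEventEnd - nav.startTime,  // rough proxy for initial rendering
    documentComplete: nav.loadEventEnd - nav.startTime            // full page load ("document complete")
  };
}

// Guarded so the sketch is harmless outside a browser page context.
const [nav] = (typeof performance !== 'undefined' && performance.getEntriesByType)
  ? performance.getEntriesByType('navigation')
  : [];
if (nav) {
  console.log(loadingMetrics(nav));
}
```

These are not the only definitions in use (tools differ on what counts as "initial rendering"), so treat the mapping of API fields to the metric names above as an approximation.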
But it will help you with CRO, because users want to visit faster sites.
-
That's good to know. I thought getting from 4 seconds to 1 second would help, but apparently it doesn't do much for rankings; it's more for the user, from what I understand.
-
Nope. The use of page speed as a (minor, and very specific) ranking factor doesn't work that way. Never has.
Page speed only applies to a small percentage of search queries, and it's used as a disqualifier for really slow pages (think over 15 or 20 seconds). Meaning if your page is really slow compared to your competitors', their page will likely be preferred over yours. But small incremental improvements in page speed won't result in ranking boosts. (And Google is on record as saying that even slower pages will earn SERP results if their content is significantly better for the user's query.)
Improve page speed for the usability of your site, not because a small incremental speedup will change rankings.
Hope that helps?
Paul