Page speed - what do you aim for?
-
Hi Mozzers - I was just looking at website speed. I know the Google guidelines on average page load time, but I'm not sure whether Google issues guidelines on any of the other four metrics. Do you know of any guidance on domain lookup, server response, server connection, or page download times?
Page Load Time (sec) - I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/
Server Response Time (sec) [Google recommends 200ms]: https://developers.google.com/speed/docs/insights/
Server Redirection Time (sec) [dependent on the number of redirects, so probably no guide figure]
Domain Lookup Time (sec)
Server Connection Time (sec)
Page Download Time (sec)
Thanks, Luke
-
IMHO, if somebody is paying us for SEO, then our GOAL is to get the homepage to load in a second or less... especially if most of the users are mobile. If it's in the mid-one-second range, we can grudgingly live with that.
I'm glad you asked about server response times... for most sites, after the content is optimized (smaller images, cleaned-up code, etc.), the initial server response time is usually the culprit pushing load time over a second... as long as the rest of the home page is "light". Light to us is under 1MB. Depending on your CMS, there are a variety of ways to get the response time down to 200ms or less.
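As a rough illustration of how you might spot-check that initial server response time yourself, here's a minimal Python sketch using only the standard library. The host and path are placeholders, and note this number includes DNS and TCP connect time, so it runs a bit higher than the pure server response time tools like WebPageTest report:

```python
import time
from http.client import HTTPConnection

def measure_ttfb_ms(host, port=80, path="/", timeout=10):
    """Time from opening the connection to receiving the first
    response bytes (a rough proxy for server response time)."""
    start = time.perf_counter()
    conn = HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()  # returns once the status line and headers arrive
        elapsed_ms = (time.perf_counter() - start) * 1000
        return resp.status, elapsed_ms
    finally:
        conn.close()

# Example (placeholder host) -- aiming for <= 200 ms per Google's guidance:
# status, ttfb = measure_ttfb_ms("www.example.com")
# print(f"HTTP {status} in {ttfb:.0f} ms")
```

For real measurement you'd repeat this several times and look at the median, since a single sample is noisy.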
Google Pagespeed, as David said, is a good measurement, but it's not the holy grail of measurements. We use it only to identify areas that need improvement. Waterfalls tell us what's taking so long and what's heavy.
You didn't ask about plugins - which are a major source of caching problems, minification errors, conflicts, extra weight, and slowdowns. We limit active plugins to TEN (including caching, SEO, and security). For some sites, plugin cleanup is the easiest way to speed up a site.
At the end of the day, nothing beats clean code, light images, and a lightning-fast server.
-
Thanks for all the feedback everyone - much appreciated, Luke
-
As long as the page loads quickly for users, I wouldn't put a huge focus on this. It's true that Google looks at page load speed, but I wouldn't put all your eggs in that basket. We have sites that show a 2.5-3.5 second load time, and they still dominate ranking results. Focus on creating a better experience.
One of the simple ways to speed up load times is to minify and compress CSS and JavaScript files as much as possible, but be sure to check that the minification does not break areas of the site. We have seen improvements as high as 75% from completing this step alone.
If you have a Joomla or WordPress website, here is a great plugin that will do this for you: https://www.jch-optimize.net/
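As a plugin-agnostic illustration of why minification plus compression pays off, here's a toy sketch in Python: a naive CSS minifier (real minifiers are far more careful about strings, `calc()`, media queries, etc.) and a gzip size comparison:

```python
import gzip
import re

def naive_minify_css(css):
    """Strip /* comments */ and collapse whitespace.
    A toy example -- production minifiers are much more careful."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()

css = """
/* header styles */
.header   {
    color : #333333 ;
    margin : 0 auto ;
}
"""
minified = naive_minify_css(css)
print(len(css), "->", len(minified), "bytes after minifying")
print(len(gzip.compress(css.encode())), "->",
      len(gzip.compress(minified.encode())), "bytes after gzip")
```

The same idea scales to real stylesheets and scripts, which is where savings like the 75% figure above come from; the gzip comparison also shows why you should enable compression on the server regardless of minification.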
-
Hi Luke! When using this tool (https://developers.google.com/speed/pagespeed/insights/) we aim to have our clients above 80 for both mobile and desktop.
-
I will be honest: I don't trust Google with PageSpeed. There is too little transparency about how the score actually relates to ranking and what is measured. One example: Google says resources like Google Analytics do not count against your score in the SERPs, but in the test they do. If you use several Google assets like AdWords, Analytics, fonts, etc., you will show a very low score, and those assets will actually keep you from seeing other things that you could fix.
What we have started doing is figuring out what Google actually needs to see and presenting only that. We started hiding tracking codes - Google's, Bing's, Facebook's, etc. - from Google's crawler, including our own analytics tracking script. I figured out that Google's test servers have the Font Awesome library and their own font library locally on the machine, so we have started hiding those from it as well. Any 3rd-party script Google does not need to see has been hidden too - Segmentify, Olark, anything really. Doing these things has raised our score quite a bit.
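For concreteness, here is a minimal sketch of the kind of server-side user-agent gate this approach implies. The token list is an assumption, not a documented set of crawler identifiers (check your own server logs), and be aware that serving different content to Google's crawler can be treated as cloaking, so tread carefully:

```python
# Assumed substrings by which speed-test crawlers identify themselves.
# These are illustrative guesses -- verify against real request logs.
SPEED_TEST_TOKENS = ("Lighthouse", "Google Page Speed", "PTST")

def is_speed_test_bot(user_agent):
    """Return True if the request looks like a speed-testing crawler."""
    return any(token in user_agent for token in SPEED_TEST_TOKENS)

def render_tracking_snippet(user_agent):
    """Emit third-party tracking tags only for real visitors."""
    if is_speed_test_bot(user_agent):
        return ""  # hide the script from the tester
    return '<script async src="https://www.google-analytics.com/analytics.js"></script>'
```

In a real site this check would live in the template layer of whatever framework serves the page.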
-
"If you have 2 or even 3 redirects, mobile users wait up to 5 seconds before seeing anything. Hint - that's why I won't click on most bit.ly, ow.ly, goo.gl links on Twitter, Facebook, or G+ when I'm on mobile: they first pass through a t.co redirect, then the redirect I can see, and sometimes even a third redirect."
Just adding a bit of weight to what you said, here's a test of a t.co link through bit.ly: https://i.gyazo.com/ca87c486a903914c2b058612cc93f3f0.png - on 3G, it's 4.27s before Google even starts loading. Without t.co: https://i.gyazo.com/f22c18a0879f76ecf653662153e17c43.png - 2.35s.
-
PageSpeed score means nothing, unfortunately. HTTP/2 puts a spanner in the works for a lot of it.
https://blog.newrelic.com/2016/02/09/http2-best-practices-web-performance/
Specifically this section:
- Concatenating JavaScript and CSS files: Combining smaller files into a larger file to reduce the total number of requests.
- Image spriting: Combining multiple small images into one larger image.
- Domain sharding: Spreading requests for static assets across several domains to increase the total number of open TCP connections allowed by the browser.
- Inlining assets: Bundling assets with the HTML document source, including base-64 encoding images or writing JavaScript code directly inline in the HTML.
-
It's hard to explain, but in general "Less is MORE!" for those numbers.
Example - redirects. Redirects can kill your site, especially for mobile users; even a simple site redirect can take a second or two. Example: www.example.com -> 301 -> m.example.com. Looks simple, doesn't it? But in reality, after the client receives the 301, it must do a new DNS lookup (for m.example.com) and then open a new connection to the new server. And that's the simple case... if you have 2 or even 3 redirects, mobile users wait up to 5 seconds before seeing anything. Hint - that's why I won't click on most bit.ly, ow.ly, goo.gl links on Twitter, Facebook, or G+ when I'm on mobile: they first pass through a t.co redirect, then the redirect I can see, and sometimes even a third redirect. I know marketers want to see "clicks", but it isn't good for mobile users.
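To make the arithmetic behind "a second or two per redirect" concrete, here's a back-of-the-envelope model in Python. The per-phase costs are assumed round numbers for a slow mobile connection, not measurements:

```python
# Assumed per-phase costs on a slow mobile connection (milliseconds).
DNS_MS = 300       # DNS lookup (only needed when the hop is a new domain)
CONNECT_MS = 400   # TCP (+TLS) connection setup
RESPONSE_MS = 300  # waiting for the redirect or final response

def chain_cost_ms(hops):
    """Estimate total wait for a redirect chain.
    `hops` is a list of domains, final destination last."""
    total, seen = 0, set()
    for domain in hops:
        if domain not in seen:  # new domain => fresh DNS lookup + connection
            total += DNS_MS + CONNECT_MS
            seen.add(domain)
        total += RESPONSE_MS
    return total

# Direct hit vs. the www -> m. redirect vs. a 3-domain shortener chain:
print(chain_cost_ms(["m.example.com"]))                      # 1000
print(chain_cost_ms(["www.example.com", "m.example.com"]))   # 2000
print(chain_cost_ms(["t.co", "bit.ly", "www.example.com"]))  # 3000
```

Every extra domain in the chain repeats the DNS-plus-connect setup cost, which is why shortener-through-shortener links feel so slow on mobile.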
Server connection time also needs to be low, which means the server needs to be closer to the user. The best example is Australia: there, even simple DNS resolving plus connecting can take a second, and at that point the client hasn't received a single byte from the server yet. You can check with WebPageTest.org (it has Australian test servers). Of course, running your own server there is expensive, so you'd need deep pockets. That's why most companies offer CDN support: since the CDN endpoint is closer to the user, it makes things at least a little faster for them, and if the CDN is set up correctly it should be much faster.
So - the idea is "Less is More!" Ideally, use WPT to benchmark your site from all over the world, and also set up Analytics to record page speed, because speed under perfect datacenter conditions is different from speed in the real world.
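As a quick sketch of measuring just the DNS-resolution slice of that connection time (standard library only; a single sample, so in practice you'd repeat and average, and note that your resolver cache makes local numbers look better than an uncached lookup from a far-away test location):

```python
import socket
import time

def dns_lookup_ms(host):
    """Time a single DNS resolution for `host`."""
    start = time.perf_counter()
    socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)
    return (time.perf_counter() - start) * 1000

# Example (placeholder host):
# print(f"{dns_lookup_ms('www.example.com'):.1f} ms")
```

Tools like WebPageTest report this phase separately per test location, which is the number the Australia example above is about.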
-
Hi Luke,
Here is what Google recommends in terms of page speed: server response time should be less than 200 ms.
Now, coming to the PageSpeed Insights tool Google provides to measure page speed (scored 1-100): the Google PageSpeed score is indeed a strong indicator of a website's loading performance.
As per my research, a total website download time of less than 10 seconds corresponds to roughly 75-85 on the PageSpeed score.
I hope this helps.
Thanks,
Vijay
-
Thanks Tom for picking up on that error - ugh - corrected now. Brain working sluggishly this morning lol!
-
Hi Luke,
"Avg. Page Load Time (sec) [Google recommends 200ms]:" That's actually for the server response time.
Personally, the only thing that matters to me is that the overall page load time is quick. I aim, if possible, for sub-2 seconds for any page.
Tom