Page speed - what do you aim for?
-
Hi Mozzers - I was just looking at website speed. I know the Google guidelines on average page load time, but I'm not sure whether Google issues guidance on any of the other four metrics. Do you know of any guidance on domain lookup, server response, server connection or page download time?
Page Load Time (sec): I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/
Server Response Time (sec): Google recommends 200ms: https://developers.google.com/speed/docs/insights/Server
Redirection Time (sec): dependent on the number of redirects, so probably no guide figure
Domain Lookup Time (sec)
Server Connection Time (sec)
Page Download Time (sec)
Thanks, Luke
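For reference, these metrics map onto the phases the browser's Navigation Timing API exposes. Below is a minimal sketch of how each one is derived from raw event timestamps - every millisecond value is an invented example, not a measurement from any real site.

```python
# A sketch of how the metrics above are derived, following the way the
# browser's Navigation Timing API splits a page load into phases.
# Every timestamp below is an invented example value in milliseconds.
timestamps = {
    "navigationStart": 0,
    "domainLookupStart": 5,
    "domainLookupEnd": 45,     # DNS resolved
    "connectStart": 45,
    "connectEnd": 130,         # TCP connection open
    "requestStart": 130,
    "responseStart": 320,      # first byte back from the server
    "responseEnd": 780,        # last byte downloaded
    "loadEventEnd": 1900,      # page fully loaded
}

def phase(start: str, end: str) -> int:
    """Duration of one phase in milliseconds."""
    return timestamps[end] - timestamps[start]

metrics = {
    "Domain Lookup Time": phase("domainLookupStart", "domainLookupEnd"),
    "Server Connection Time": phase("connectStart", "connectEnd"),
    "Server Response Time": phase("requestStart", "responseStart"),
    "Page Download Time": phase("responseStart", "responseEnd"),
    "Page Load Time": phase("navigationStart", "loadEventEnd"),
}

for name, ms in metrics.items():
    print(f"{name}: {ms} ms")
```

With real data you'd read these timestamps from the browser's `performance.timing` object; the point is simply that the metrics Luke lists are non-overlapping slices of the same timeline.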
-
IMHO, if somebody is paying us for SEO, then our GOAL is to get the homepage to load in a second or less... especially if most of the users are on mobile. If it's in the mid-1-second range, we can grudgingly live with that.
I'm glad you asked about server response times. For most sites, after the content is optimized (smaller images, cleaned-up code, etc.), the initial server response time is usually the culprit pushing load time over a second - as long as the rest of the home page is "light". Light, to us, is under 1MB. Depending on your CMS, there are a variety of ways to get the response time down to 200ms or less.
Google Pagespeed, as David said, is a good measurement, but it's not the holy grail of measurements. We use it only to identify areas that need improvement. Waterfalls tell us what's taking so long and what's heavy.
You didn't ask about plugins, which are a major culprit behind caching problems, minification errors, conflicts, slow speed and extra weight. We limit all active plugins to TEN (including caching, SEO and security). For some sites, plugin clean-up is the easiest way to speed up the site.
At the end of the day, nothing beats clean code, light images and a lightning-fast server.
-
Thanks for all the feedback everyone - much appreciated, Luke
-
As long as the page loads quickly for users, I wouldn't put a huge focus on this. It's true that Google looks at page load speed, but I wouldn't put all your eggs in that basket. We have sites that show a 2.5-3.5 second load time, and they still dominate the ranking results. Focus on creating a better experience.
One of the simplest ways to speed up load times is to minify and compress CSS and JavaScript files as much as possible, but be sure to check that the minification doesn't break areas of the site. We have seen improvements as high as 75% from completing this step alone.
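To illustrate where the savings come from, here is a deliberately naive sketch of CSS minification plus gzip compression in Python. A real minifier (or a plugin like the one below) is far more careful about strings, selectors and edge cases - this is only a toy.

```python
# Naive CSS minification: strip comments, collapse whitespace, then gzip.
# Real minifiers handle many edge cases this toy ignores.
import gzip
import re

def naive_minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # trim around punctuation
    return css.strip()

css = """
/* main stylesheet */
body {
    margin : 0 ;
    font-family : sans-serif ;
}
"""

minified = naive_minify_css(css)
original_gz = len(gzip.compress(css.encode()))
minified_gz = len(gzip.compress(minified.encode()))
print(minified)  # -> body{margin:0;font-family:sans-serif;}
print(f"raw: {len(css)} -> {len(minified)} bytes; "
      f"gzipped: {original_gz} -> {minified_gz} bytes")
```

Minification and compression stack: minifying shrinks the raw bytes, and gzip (or brotli) on the server shrinks what actually crosses the wire.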
If you have a Joomla or WordPress website, here is a great plugin that will do this for you: https://www.jch-optimize.net/
-
Hi Luke! When using this tool (https://developers.google.com/speed/pagespeed/insights/) we aim to have our clients above 80 for both mobile and desktop.
-
I will be honest, I don't trust Google with PageSpeed. Too few questions are asked about how it actually fits in with the metrics and what is used. One example: Google says resources like Google Analytics do not count against your score in the SERPs, but in the test they do. If you use several Google assets like AdWords, Analytics, fonts, etc., you will show a very low score, and using them will actually block you from seeing other things that you could fix.
What we have started doing is figuring out what Google actually needs and presenting only that to it. We started hiding tracking codes - Google, Bing, Facebook, etc. - from Google's crawler, including our own analytics tracking script. I figured out that Google's test servers have the Font Awesome library and their own font library locally on the machine, so we have started hiding those from it as well. Any third-party script that Google does not need to see has been hidden too - Segmentify, Olark, anything really. Doing these things has raised our score quite a bit.
-
"if you have 2 or even 3 redirects mobile users wait for 5 seconds before see anything. Hint - that's why i won't click on most bit.ly, ow.ly, goo.gl links in Twitter, Facebook, G+ when i'm on mobile. Because they first pass via t.co redirect then redirect that i can see and sometime even 3rd redirect."
Just adding a bit of weight to what you said, here's a test of a t.co link through bit.ly: https://i.gyazo.com/ca87c486a903914c2b058612cc93f3f0.png - on 3G, it takes 4.27s to even start loading Google. Without t.co: https://i.gyazo.com/f22c18a0879f76ecf653662153e17c43.png - 2.35s.
-
The PageSpeed score means nothing, unfortunately. HTTP/2 puts a spanner in the works for a lot of it.
https://blog.newrelic.com/2016/02/09/http2-best-practices-web-performance/
Specifically, this section:
- Concatenating JavaScript and CSS files: Combining smaller files into a larger file to reduce the total number of requests.
- **Image spriting:** Combining multiple small images into one larger image.
- Domain sharding: Spreading requests for static assets across several domains to increase the total number of open TCP connections allowed by the browser.
- Inlining assets: Bundling assets with the HTML document source, including base-64 encoding images or writing JavaScript code directly inside `<script>` tags.
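As a quick illustration of that last item, here's a sketch of inlining an image as a base64 data URI in Python - the payload bytes are a placeholder, not a real image.

```python
# Sketch of the "inlining assets" item: embedding a small image in the
# HTML as a base64 data URI so it costs no extra request. The payload
# bytes are a placeholder; a real site would read them from disk.
import base64

icon_bytes = b"placeholder-image-bytes"
data_uri = "data:image/png;base64," + base64.b64encode(icon_bytes).decode("ascii")
html = f'<img src="{data_uri}" alt="icon">'
print(html)
```

Under HTTP/2 this trick tends to backfire, which is the article's point: inlined bytes can't be cached or prioritized separately from the page that carries them.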
-
It's hard to explain, but in general for those numbers: "Less is MORE!"
Example - redirects. Redirects can kill your site, especially for mobile users; even a simple site redirect can take a second or two. Example: www.example.com -> 301 -> m.example.com. Looks simple, doesn't it? But in reality, after the client receives the 301 redirect it must do a fresh DNS lookup (for m.example.com) and then open a new connection to the new server. And that's the simple case... if you have 2 or even 3 redirects, mobile users wait 5 seconds before they see anything. Hint - that's why I won't click most bit.ly, ow.ly, or goo.gl links on Twitter, Facebook or G+ when I'm on mobile: they first pass through a t.co redirect, then the redirect I can see, and sometimes even a third. I know marketers want to see "clicks", but it isn't good for mobile users.
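To put rough numbers on that redirect chain, here's a back-of-the-envelope model in Python. The per-step latencies (200ms DNS, 300ms connect, 300ms round trip) are assumed mobile-3G figures, not measurements, and the URLs are invented.

```python
# Back-of-the-envelope model of a redirect chain: each hop to a new
# hostname pays one DNS lookup, one TCP connect and one request round
# trip. Latencies are rough mobile-3G assumptions, not measurements.
DNS_MS, CONNECT_MS, REQUEST_MS = 200, 300, 300

def chain_cost_ms(hops):
    """Milliseconds spent before the final page even starts loading."""
    cost, last_host = 0, None
    for url in hops:
        host = url.split("/")[2]
        if host != last_host:          # new hostname: resolve + reconnect
            cost += DNS_MS + CONNECT_MS
            last_host = host
        cost += REQUEST_MS             # the redirect (or final) response
    return cost

direct = chain_cost_ms(["https://m.example.com/page"])
chained = chain_cost_ms([
    "https://t.co/abc",                # shortener hop 1
    "https://bit.ly/xyz",              # shortener hop 2
    "https://www.example.com/page",    # desktop URL, 301 to mobile
    "https://m.example.com/page",      # finally the real page
])
print(direct, chained)   # 800 3200
```

Each extra hostname in the chain repeats the full DNS-plus-connect cost, which is why shortener chains hurt so much more than a single same-host redirect.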
Server connection time also needs to be low, which means the server needs to be closer to the user. The best example is Australia: there, even simple DNS resolution plus connection can take a second - and the client hasn't received a single byte from the server yet. You can check this on WebPageTest.org (it has Australian test servers). Of course, running a server there is expensive, so you need deep pockets. That's why most companies offer CDN support: since the CDN endpoint is closer to the user, it makes things a little faster for them - and if the CDN is set up correctly, much faster.
So the idea is "Less is More!" Best of all, use WPT to benchmark your site from all over the world, and also set up Analytics to measure speed, because speed under perfect datacenter conditions is different from speed in the real world.
-
Hi Luke,
Here is what Google recommends in terms of page speed: a server response time of less than 200 ms.
As for the PageSpeed Insights tool Google provides to measure page speed (rated 1-100), the PageSpeed score is indeed a strong indicator of a website's loading performance in terms of time.
From my research, a total website download time under 10 seconds corresponds to roughly 75-85 on the PageSpeed score.
I hope this helps.
Thanks,
Vijay
-
Thanks Tom for picking up on that error - ugh - corrected now. Brain working sluggishly this morning lol!
-
Hi Luke,
"Avg. Page Load Time (sec) [Google recommends 200ms]:" That's actually for the server response time.
Personally, the only thing that matters is that the overall page load time is quick. I aim if possible for sub 2 seconds for any page.
Tom