How fast should a page load to get a green light in Google's PageSpeed?
-
So, I'm trying to get a big e-commerce site to work on their page loading issues. Their question left me without an answer: how fast should a site be so that it gets a green light in Google's PageSpeed test?
Is there a number in seconds? Do we know that?
-
To be clear - PageSpeed Insights does not measure the speed of a page. It's entirely possible to have a score of 90 and a load time of (a disastrous) 29 seconds. I have screenshots to prove it.
All PageSpeed does is check for the typical server/software configurations that "usually" lead to faster pages, as Linda mentions.
All you should care about is what your VISITORS experience and what they think is "fast enough". You need to put RUM (Real User Monitoring) on the site's pages so you can directly correlate visitor behaviour/conversions to page speed. (And so you'll actually know what speed real users experience, as opposed to totally synthetic speed tests like PageSpeed Insights or even webpagetest.org/gtmetrix etc.)
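For illustration, here's a minimal hand-rolled RUM sketch using the browser's Navigation Timing API. The /rum collector endpoint is a placeholder you'd swap for your own logging endpoint (or skip entirely in favour of a commercial RUM service):

```html
<script>
// Minimal RUM beacon sketch using the Navigation Timing API.
// '/rum' is a hypothetical collector endpoint - point it at your own logger.
window.addEventListener('load', function () {
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(function () {
    var t = window.performance && window.performance.timing;
    if (!t || !navigator.sendBeacon) { return; } // older browsers: bail out
    navigator.sendBeacon('/rum', JSON.stringify({
      page: location.pathname,
      ttfbMs: t.responseStart - t.navigationStart,  // time to first byte
      loadMs: t.loadEventEnd - t.navigationStart    // full page load time
    }));
  }, 0);
});
</script>
```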
If the site uses Google Analytics, this RUM is built in, but you must adjust the tracking code snippet to get worthwhile value from it. By default, Analytics only tracks page speed for 1% of pageviews. Adjusting the tracking snippet lets you track up to 100% of pageviews, capped at 10,000 pageviews per day.
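As a sketch (assuming the classic async analytics.js snippet; UA-XXXXXX-Y is a placeholder for your own property ID), raising the sample rate looks like this:

```html
<script>
// Async analytics.js loader with the Site Speed sample rate raised from
// the default 1% to 100%. GA still caps processed timing hits per day.
window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
ga.l = +new Date();
ga('create', 'UA-XXXXXX-Y', 'auto', { siteSpeedSampleRate: 100 });
ga('send', 'pageview');
</script>
<script async src="https://www.google-analytics.com/analytics.js"></script>
```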
You'll have SERIOUS power in your hands when you can see the actual speed of all pages, taking into account REAL user variables like connection speed, location, browser, mobile vs. desktop, time of day/server load, and so on. Don't guess - use data.
Hope that helps?
Paul
P.S. If the site does have really high-volume traffic, you will already have at least a bit of data in the Site Speed report in GA at the default 1%. You can use it as a baseline to prompt action and to measure improvements, but you want to get up to 10,000 tracked pageviews per day as soon as possible.
-
Here are the details about PageSpeed Insights. A score of 85 or better will get you the green bar.
Note that this is based on network-independent aspects of page performance, like server configuration, the HTML structure of a page, and its use of external resources such as images, JavaScript, and CSS; it is not a direct "speed test." [The actual number of seconds will vary based on connection.]
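If you want to check that score programmatically, here's a rough sketch against the public PageSpeed Insights API (v2). The YOUR_API_KEY placeholder and the exact response shape are assumptions to verify against the current API docs:

```javascript
// Sketch: fetch a URL's PageSpeed score from the PageSpeed Insights API (v2).
// YOUR_API_KEY is a placeholder; verify the response shape in the API docs.
var endpoint = 'https://www.googleapis.com/pagespeedonline/v2/runPagespeed' +
               '?url=' + encodeURIComponent('https://www.example.com/') +
               '&key=YOUR_API_KEY';
fetch(endpoint)
  .then(function (res) { return res.json(); })
  .then(function (data) {
    var score = data.ruleGroups.SPEED.score; // 85 or better shows green
    console.log('PageSpeed score:', score);
  });
```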
-
I'd say that 5kb is definitely worth it! Even on bandwidth alone, if that image gets downloaded 10 times, that's 50kb in bandwidth saved.
-
The answer has already been given here, but I just wanted to add that the recommendations PageSpeed usually makes aren't always effective enough to be worth acting on. If all it would achieve is shaving 5kb off an image so people download slightly less, it's probably not worth picking up. That said, it provides awesome general guidelines for making sure that redesigns or new sites you're building follow best practice from the start.
-
I don't believe there is a 'magic number'. As fast as you can get it...
Whether improving your score will help you at all also depends on what the competition is doing.
Then again, I've read over and over that the PageSpeed score is sometimes not worth worrying about too much, as some of its suggestions for improvement simply can't be implemented while keeping the site working properly.
But it'll be hard to convince your client to become an 'unbeliever' in a grade handed out by a major brand…!
-
The answer shouldn't be "what does it take to get a green light"; it should be "what's the quickest we can get it for our users". The ideal speed is under two seconds to load a page.