How fast should a page load to get a green light in Google's PageSpeed Insights?
-
So, I'm trying to get a big e-commerce site to work on their page loading issues. Their question left me without an answer: how fast should a site be so that it will get a green light on Google's PageSpeed test?
Is there a number in seconds? Do we know that?
-
To be clear - PageSpeed Insights does not measure the speed of a page. It's entirely possible to have a score of 90 and a load time of (a disastrous) 29 seconds. I have screenshots to prove it.
All PageSpeed does is check for the typical server/software configurations that "usually" lead to faster pages, as Linda mentions.
All you should care about is what your VISITORS experience and what they think is "fast enough". You need to put RUM (Real User Monitoring) on the site's pages so you can directly correlate visitor behaviour/conversions to page speed. (And so you'll actually know what speed real users experience, as opposed to the totally synthetic speed tests like Pagespeed Insights or even webpagetest.org/gtmetrix etc.)
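For example, a minimal sketch of browser-side RUM using the Navigation Timing API (`performance.timing` is the real browser API; the `/rum` beacon endpoint is just a placeholder you'd replace with your own collector):

```javascript
// Compute total page load time from Navigation Timing fields:
// navigation start until the load event has finished.
function pageLoadMillis(timing) {
  return timing.loadEventEnd - timing.navigationStart;
}

// In the browser, run after the load event has completed:
// var ms = pageLoadMillis(performance.timing);
// navigator.sendBeacon('/rum', JSON.stringify({ loadMs: ms, page: location.pathname }));
```

Collecting that number per pageview, rather than from a lab test, is what lets you correlate real speed with behaviour and conversions.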
If the site uses Google Analytics, this RUM is built in, but you must adjust the tracking code snippet to get worthwhile value from it. By default, Analytics will only track 1% of pageviews' speed. Adjusting the tracking snippet will allow tracking of up to 100% of pageviews or 10,000 pageviews per day.
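A sketch of that adjustment, assuming the classic analytics.js setup. `siteSpeedSampleRate` is the real field; `'UA-XXXXX-Y'` is a placeholder property ID and `buildCreateCommand` is just an illustrative helper, not part of the Analytics API:

```javascript
// Build the arguments for the analytics.js 'create' command with an
// increased Site Speed sample rate (default is 1).
function buildCreateCommand(trackingId, sampleRate) {
  // 100 is the documented maximum; clamp anything higher.
  var rate = Math.min(sampleRate, 100);
  return ['create', trackingId, 'auto', { siteSpeedSampleRate: rate }];
}

var cmd = buildCreateCommand('UA-XXXXX-Y', 100);
// In the real snippet you'd pass this to the command queue:
// ga.apply(null, cmd);
// ga('send', 'pageview');
```

Even at 100%, Google still caps speed tracking at 10,000 timing hits per day, which is why high-traffic sites never see every pageview timed.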
You'll have SERIOUS power in your hands when you can see the actual speed performance of all pages, taking into account REAL user variables like connection speed, location, browser, mobile vs. desktop, time of day/server load, etc. Don't guess - use data.
Hope that helps?
Paul
P.S. If the site does have really high-volume traffic, you will already have at least a bit of data in the Site Speed report in GA at the default 1%. You can use it as a baseline to prompt action and to measure improvements, but you want to get up to that 10,000 pageviews tracked per day as soon as possible.
-
Here are the details about PageSpeed Insights. A score of 85 or better will get you the green bar.
Note that this is based on network-independent aspects of page performance like server configuration, HTML structure of a page and its use of external resources such as images, JavaScript, and CSS and is not a direct "speed test." [The actual number of seconds will vary based on connection.]
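For reference, PageSpeed Insights scores can also be fetched programmatically. A minimal sketch of building the request URL, assuming the v5 REST endpoint (check the current API docs for the exact version and parameters; `example.com` and the `mobile` strategy are placeholders):

```javascript
// Build a PageSpeed Insights API request URL for a given page and strategy.
function psiRequestUrl(pageUrl, strategy) {
  var base = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  return base + '?url=' + encodeURIComponent(pageUrl) + '&strategy=' + strategy;
}

var reqUrl = psiRequestUrl('https://example.com/', 'mobile');
// Fetch reqUrl and read the score from the JSON response.
```

This is handy for tracking the score across many pages without pasting each one into the web UI.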
-
I'd say that 5kb is definitely worth it! Even on bandwidth alone, if you get that image downloaded 10 times that's 50kb in bandwidth saved.
-
The answer has already been given here, but I just wanted to add that many of the recommendations it makes aren't always effective enough to be worth acting on. If all you'd save is 5kb on image load so people download slightly less, it's probably not worth picking up. However, it provides excellent general guidelines to make sure that redesigns or new sites you're building follow best practices from the start.
-
I don't believe there is a 'magic number'. As fast as you can get it...
It also depends on what the competition does if it will help you at all to improve your score.
Then again, I've read over and over that the PageSpeed score sometimes isn't something to worry too much about, as some of its suggestions for improvement are simply not feasible while keeping the site working properly.
But it'll be hard to convince your client to become an 'unbeliever' in a grade from a major brand…!
-
The answer shouldn't be "what does it take to get a green light"; it should be "what's the quickest we can get it for our users". The ideal speed is under two seconds to load a page.