Confused about PageSpeed Insights vs Site Load for SEO Benefit?
-
I was comparing sites with a friend of mine. I have a higher PageSpeed Insights score for mobile and desktop than he does, but his Google Analytics reports a faster page load time than mine. So, assuming all other things are equal (quality of content, links, etc.), is it better to have a site with a higher PageSpeed score or a faster site load? To me, it makes more sense for it to be the latter, but if that's true, what's the point of PageSpeed Insights?
Thanks for your help! I appreciate it.
- Ruben
-
Thanks for the insights!
- Ruben
-
Press F12 in your browser and use the Network tab: you will not only get your load time, you will get it broken down so you can see where the problem is.
Bing did a study on load times, and every 10ms of delay worked out to cost the average ecommerce site $200+ a year. What an "average" ecommerce site is, I am not sure, but it tells you something.
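If you want those numbers in a script rather than reading the waterfall by eye, here's a minimal sketch using the browser's Navigation Timing API (run it in the console on the page you're testing; the field names come from the standard PerformanceNavigationTiming entry):

```typescript
// Rough load-time breakdown for the current page, similar to what the
// Network tab shows. Run in the browser console after the page has loaded.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

if (nav) {
  console.table({
    dnsMs: nav.domainLookupEnd - nav.domainLookupStart,
    connectMs: nav.connectEnd - nav.connectStart,
    ttfbMs: nav.responseStart - nav.requestStart,
    downloadMs: nav.responseEnd - nav.responseStart,
    domProcessingMs: nav.domContentLoadedEventStart - nav.responseEnd,
    fullLoadMs: nav.loadEventEnd - nav.startTime,
  });
}
```

Each value is in milliseconds, so you can see at a glance whether the time is going to DNS, the server, or the page itself.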
-
I would prefer better load times in Analytics. It samples the actual load time of the pages on your site, and is a good indication of how fast your users are seeing your content. You build your site for users, not for search engines. Normally, the faster your site loads, the better the user experience will be.
Apart from that, Analytics allows you to analyse which browsers, operating systems, etc. have the best/worst loading times, which helps you to prioritise the issues that should be solved.
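To make the "real users" point concrete, Analytics-style Site Speed is just real-user measurement: each visitor's browser reports how long the page actually took for them. Here is a minimal sketch of the same idea, assuming a hypothetical /rum-collect endpoint on your own server:

```typescript
// Minimal real-user-measurement sketch: send each visitor's actual load
// time to your own server. The "/rum-collect" endpoint is hypothetical.
window.addEventListener("load", () => {
  // loadEventEnd is only filled in once the load handlers have finished,
  // so defer the measurement by one tick.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    if (!nav) return;

    navigator.sendBeacon(
      "/rum-collect",
      JSON.stringify({
        page: location.pathname,
        loadTimeMs: Math.round(nav.loadEventEnd - nav.startTime),
        // lets you slice by browser/OS later, like the Analytics report
        userAgent: navigator.userAgent,
      })
    );
  }, 0);
});
```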
PageSpeed Insights is a great tool and will give you a lot of useful information on how you can optimise. It is not, however, a measure of how fast a page is loading. If you have 4 x 200KB images on your site that are losslessly compressed, the tool will be quite happy to give you a good score on image optimisation, even though images of this size will take ages to load over a mobile connection. On the other hand, it can give you a low score for some render-blocking JavaScript or CSS file that in reality hardly has an effect on the user experience.
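For what it's worth, you can also pull those PageSpeed Insights scores programmatically and compare them against your real load times. A rough sketch, assuming the public v5 runPagespeed endpoint (the response shape may differ slightly, and you'll want an API key for anything beyond occasional use):

```typescript
// Sketch: fetch the PageSpeed Insights result for a URL and print the
// overall performance score. Assumes the public v5 runPagespeed endpoint.
async function pageSpeedScore(url: string): Promise<void> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(url)}&strategy=mobile`; // add &key=YOUR_KEY for regular use

  const res = await fetch(endpoint);
  const data = await res.json();

  // Lighthouse performance score, 0 to 1 (assumed response shape)
  const score = data?.lighthouseResult?.categories?.performance?.score;
  console.log(`Mobile performance score for ${url}: ${score}`);
}

pageSpeedScore("https://www.example.com/");
```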
There is a third tool I often use to measure page speed (webpagetest.org). It also indicates areas of improvement and gives scores on each individual item, and it shows you the load time of each individual resource on your page. Maybe its most important feature: it allows you to see how fast the visible content is completely rendered on screen (which is in fact the most important measure for your visitors).
Hope this helps,
Dirk
-
Hi there
PageSpeed Insights is a snapshot of a page at the exact moment you ask it to crawl it. Google Analytics Site Speed evaluates your entire domain (or a group of pages, or a single page, depending on what you want to look at) over a period of time: a day, a week, a month, a year, and so on.
I find both to have their value: PageSpeed Insights for a quick assessment of a particular page and its resources, and Google Analytics Site Speed for a more holistic view of performance that takes the entire site into consideration, breaking down everything from redirect time to server connection and download speeds.
I would consider GA Site Speed to be more valuable, but again, both hold their weight and have benefits depending on what you are looking at.
Hope this helps! Good luck!
Related Questions
-
Http vs Https Related Rankings Drop?
I've noticed in a number of keyword ranking tools (Moz included) that our rankings have dropped substantially for a number of our top performing keywords precisely 7 days back. When you view the attached screenshot you'll see there was a drastic drop in the overall organic impressions as well as a drop in keyword rankings. I also noticed that all the keywords which have dropped in rank now show with the https version of our home page URL. I've read up on this and I believe that this should not cause a drop in rankings, but we have even added https as a domain in Webmaster Tools with no improvement. Quite simply, has Google de-indexed our http home page URL which was previously tied to our higher rankings for our core keywords? How can we get this back without "disavowing" our https version of the site? We're not doing anything to game search results so I don't think we're being penalized; simply there is some sort of technical glitch taking place between recognizing the HTTP vs HTTPS versions of our site. Our home page is goo.gl/qVPRwf and an example keyword is "wedding ring sets his and hers". Can anyone recommend further debugging steps or have an understanding of what can be done at this point? Also, if it helps, I have studied the Help Center, read the FAQs and searched for similar questions with no success.
Algorithm Updates | punitshah
-
Condensing content for web site redesign
We're working on a redesign and are wondering if we should condense some of the content (as recommended by an agency), and if so, how that will affect our organic efforts. Currently a few topics have individual pages for each section, such as (1) Overview (2) Symptoms and (3) Treatment. For reference, the site has a similar structure to http://www.webmd.com/heart-disease/guide/heart-disease-overview-fact. Our agency has sent us over mock-ups which show these topics being condensed into one and using a script/AJAX to display only the content that is clicked on. Knowing this, if we were to choose this option, that would result in us having to implement redirects because only one page would exist, instead of all three. Can anyone provide insight into whether we should keep the topic structure as is, or if we should take the agency's advice and merge all the topic content? *Note: The reason the agency is pushing for the merging option is that they say it helps with page load time. Thank you in advance for any insight!
Algorithm Updates | ATShock
-
What is your experience with markups (schema.org) in terms of SEO and best practice learnings?
Hi, I am looking to implement schema markups into a variety of websites and am currently wondering about best practices. I am working on energy providers, building materials, e-retailers, and social associations, among others. While I understand every single one of these is an individual case, I could do with some advice from you guys. Which markups would you consider key for search engines? I would have naturally chosen markups to highlight the business name, location and products, but there is so much more to schema.org! Thanks,
Algorithm Updates | A_Q
-
Dumb International SEO question?
Buongiorno from 18 degrees C Wetherby UK... Client asks: "My Swedish site is http://www2.kingspanpanels.se/ - how important is having the Swedish suffix in the URL with regard to rankings in Sweden?" I find these questions really challenging; it's like "Hey, if I change this URL my SEO problems will be fixed", as if it's that easy. So my question is: "How weighted is the URL suffix / ccTLD in terms of SEO success for a territory / country?" Put another way: "If the Swedish suffix .se were removed, would it impact rankings in Sweden in any way?" Grazie tanto,
David
Algorithm Updates | Nightwing
-
Bing's indexed pages vs pages appearing in results
Hi all. We're trying to increase our efforts in ranking for our keywords on Bing, and I'm discovering a few unexpected challenges. Namely, Bing is reporting that 16,000+ pages have been crawled, yet a site:mywebsite.com search on Bing shows fewer than 1,000 results. I'm aware that Duane Forrester has said they don't want to show everything, only the best. If that's the case, what factors must we consider most to encourage Bing's engine to display most if not all of the pages they crawl on my site? I have a few ideas of what may be turning Bing off, so to speak (some duplicate content issues, 301 redirects due to URL structure updates), but if there's something in particular we should monitor and/or check, please let us know. We'd like to prioritize 🙂 Thanks!
Algorithm Updates | brandonRT
-
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page via an XML sitemap it will have no link juice and will appear very low in search results, if at all. The priority field in XML sitemaps also seems pretty much irrelevant to me: Google determines the priority of a page based on the number of inbound links to it, and if your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content. This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc, and if Google can't trust the priority, changefreq, or lastmod fields they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
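To make the lastmod idea concrete, here's a rough sketch of generating a sitemap where lastmod comes from real modification dates (file mtimes for a hypothetical static site; a CMS would use its own updated-at field), so only genuinely changed pages advertise a new date:

```typescript
// Sketch: build sitemap.xml with <lastmod> taken from file modification
// times. The ./public folder and example.com domain are placeholders.
import { readdirSync, statSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const siteRoot = "./public";
const baseUrl = "https://www.example.com";

const entries = readdirSync(siteRoot)
  .filter((file) => file.endsWith(".html"))
  .map((file) => {
    const lastmod = statSync(join(siteRoot, file)).mtime.toISOString().slice(0, 10);
    return `  <url>\n    <loc>${baseUrl}/${file}</loc>\n    <lastmod>${lastmod}</lastmod>\n  </url>`;
  });

writeFileSync(
  "sitemap.xml",
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries.join("\n")}\n` +
    `</urlset>\n`
);
```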
Algorithm Updates | pasware
-
Big site SEO: To maintain html sitemaps, or scrap them in the era of xml?
We have dynamically updated XML sitemaps which we feed to Google et al. Our XML sitemap is updated constantly and takes minimal hands-on management to maintain. However, we still have an HTML version (which we link to from our homepage), a legacy from back in the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone. So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
Algorithm Updates | linklater