Confused about PageSpeed Insights vs Site Load for SEO Benefit?
-
I was comparing sites with a friend of mine. I have a higher PageSpeed Insights score than he does for both mobile and desktop, but his Google Analytics shows a faster average page load time than mine. So, assuming all else is equal (quality of content, links, etc.), is it better to have a site with a higher PageSpeed score or a faster actual load time? To me it makes more sense for it to be the latter, but if that's true, what's the point of PageSpeed Insights?
Thanks for your help! I appreciate it.
- Ruben
-
Thanks for the insights!
- Ruben
-
Press F12 in your browser and use the Network tab: you'll not only get your total load time, you'll also see it broken down so you can pinpoint where the problem is.
Bing did a study on load times and found that every extra 10ms worked out to cost the average ecommerce site $200+ a year. What an "average" ecommerce site is, I'm not sure, but it tells you something.
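To make the figure above concrete, here's a back-of-envelope sketch. The only number taken from the study quoted above is the ~$200/year per 10ms; the 250ms example input is made up:

```python
# Rough estimate: annual revenue cost of extra page latency, using the
# Bing-study figure quoted above (~$200/year per 10 ms of delay for an
# "average" ecommerce site). The 250 ms input is an illustrative example.
COST_PER_10MS_PER_YEAR = 200  # USD per year, per 10 ms of added delay

def annual_latency_cost(extra_ms):
    """Estimated yearly revenue lost to `extra_ms` of added load time."""
    return (extra_ms / 10) * COST_PER_10MS_PER_YEAR

print(annual_latency_cost(250))  # a page that loads 250 ms slower -> 5000.0
```

Obviously your own numbers will differ, but it shows why shaving even tens of milliseconds can matter.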
-
I would prefer better load times in Analytics. It samples the actual load times of the pages on your site and is a good indication of how fast your users are seeing your content. You build your site for users, not for search engines, and normally the faster your site loads, the better the user experience will be.
Apart from that, Analytics allows you to analyse which browsers, operating systems, etc. have the best/worst loading times, which helps you to prioritise the issues that should be solved.
PageSpeed Insights is a great tool and will give you a lot of useful information on how to optimise. It is not, however, a measure of how fast a page is actually loading. If you have 4 x 200KB images on your site that are losslessly compressed, the tool will happily give you a good score on image optimisation, even though images of that size will take ages to load over a mobile connection. On the other hand, it can give you a low score for a render-blocking JavaScript or CSS file that in reality hardly affects the user experience.
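To illustrate the image example above, here's a quick transfer-time estimate. The connection speeds are illustrative assumptions, not measurements:

```python
# Why a good PageSpeed image score can still mean a slow page: estimated
# raw transfer time for the 4 x 200 KB images mentioned above, over a few
# hypothetical connection speeds (illustrative numbers only).

def transfer_seconds(total_bytes, mbps):
    """Seconds to move `total_bytes` over an `mbps` megabit/s link."""
    bits = total_bytes * 8
    return bits / (mbps * 1_000_000)

image_bytes = 4 * 200 * 1024  # 4 images x 200 KB each
for label, mbps in [("fast broadband", 50.0), ("decent 3G", 2.0), ("poor mobile", 0.5)]:
    print(f"{label}: {transfer_seconds(image_bytes, mbps):.1f}s")
```

Even "optimised" 200KB images add seconds on a slow mobile link, which the score alone won't tell you.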
There is a third tool I often use to measure page speed (webpagetest.org). It also indicates areas for improvement and gives a score on each individual item, and it will show you the load time of each individual resource on your page. Maybe its most important feature: it lets you see how quickly the visible content is completely rendered on screen, which is in fact the most important measure for your visitors.
Hope this helps,
Dirk
-
Hi there
PageSpeed Insights is a snapshot of a page at the exact moment you ask it to run. Google Analytics Site Speed evaluates your entire domain (or a group of pages, or a single page, depending on what you want to look at) over a period: a day, week, month, year, and so on.
I find both to have their value. One (PageSpeed) gives a quick assessment of a particular page and its resources; the other (Google Analytics Site Speed) gives a more holistic view that takes your entire site into consideration, breaking down everything from redirect time to server connection and download speeds.
I would consider GA Site Speed to be more valuable, but again, both hold their weight and have benefits depending on what you are looking at.
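To sketch what "speed over a period" means versus a one-off test: GA samples real visitors' load times, and summarising that sample is what gives you the holistic picture. The timings below are made-up examples, not GA data:

```python
# Sketch: summarising a period's worth of sampled page load times (ms),
# the way an analytics report does, vs. a single-snapshot test.
# The sample values below are invented for illustration.
from statistics import mean, median

def summarize(load_times_ms):
    """Return average, median, and a rough 90th percentile of the samples."""
    s = sorted(load_times_ms)
    p90 = s[int(len(s) * 0.9)]  # rough 90th-percentile pick
    return {"avg": mean(s), "median": median(s), "p90": p90}

week_of_samples = [900, 1100, 1200, 1300, 1500, 1800, 2400, 3100, 4200, 9000]
print(summarize(week_of_samples))
```

Note how a few slow outliers pull the average well above the median; a single snapshot test would show you neither.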
Hope this helps! Good luck!