Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How important is Lighthouse page speed measurement?
-
Hi,
Many experts cite the Lighthouse speed score as an important factor in search ranking. It's confusing, because several top sites have Lighthouse scores of 30-40, yet they rank well. Also, some sites that load quickly have a low Lighthouse score (when I test on mobile/desktop, they load much quicker than Lighthouse suggests).
When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can be 30-40.
Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but the Lighthouse score is similar to Airbnb's and so forth. We have many photos similar to the one below, probably 30-40 of them, many of which load asynchronously.
Should we spend more time optimizing for Lighthouse, or is it OK as is? Are large images fine to load asynchronously?
Thank you,
Dave
-
It's absolutely essential that your company website is fast.
Don't purchase slow, cheap web hosting, regardless of your business type.
Instead, purchase super-fast hosting for your business.
Sometimes it's much more expensive, but it's well worth it, as it can help improve your organic SEO.
We purchased lightning-fast hosting; this is one reason why we are now selling more bath garden offices than ever before.
-
It is important to distinguish between PageSpeed Insights and Lighthouse. It may be more useful to follow PageSpeed Insights for your website. The differences between the two become clear after reading this article, which explains them in an accessible way: https://rush-analytics.com/blog/google-pagespeed-insights-vs-lighthouse-how-do-they-differ
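One practical difference is that a single PageSpeed Insights response contains both a lab-based Lighthouse run and real-user field data from the Chrome UX Report, so you can compare the two for the same URL. Here is a minimal sketch of pulling both out of a PSI-style response; the field names follow my understanding of the v5 API response shape, so treat them as assumptions and verify against the API docs:

```javascript
// Sketch: summarize the lab (Lighthouse) score and the field (CrUX) data
// from a PageSpeed Insights v5-style API response. The property names
// (lighthouseResult, loadingExperience, etc.) are assumptions based on the
// documented v5 shape; check them against the official API reference.
function summarizePsi(response) {
  const lab = response.lighthouseResult.categories.performance.score;
  const field = response.loadingExperience;
  return {
    labScore: Math.round(lab * 100),        // Lighthouse lab score, 0-100
    fieldCategory: field.overall_category,  // FAST / AVERAGE / SLOW from real users
    lcpMsP75: field.metrics.LARGEST_CONTENTFUL_PAINT_MS.percentile,
  };
}

// Example with a trimmed-down, hypothetical response: a site can have a
// mediocre lab score while real users still experience a fast load.
const sample = {
  lighthouseResult: { categories: { performance: { score: 0.38 } } },
  loadingExperience: {
    overall_category: 'FAST',
    metrics: { LARGEST_CONTENTFUL_PAINT_MS: { percentile: 1900 } },
  },
};
console.log(summarizePsi(sample));
// → { labScore: 38, fieldCategory: 'FAST', lcpMsP75: 1900 }
```

A gap like the hypothetical one above (lab score 38, field category FAST) is exactly the situation the original question describes.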
-
My understanding is that "Page Experience" signals (including the new "Core Web Vitals") will be combined with existing signals like mobile-friendliness and HTTPS security in May 2021. This is according to announcements by Google:
https://developers.google.com/search/blog/2020/05/evaluating-page-experience
https://developers.google.com/search/blog/2020/11/timing-for-page-experience
So, these will be search signals, but there are many other very important search signals that can outweigh them. Even if a page on John Deere doesn't pass the Core Web Vitals criteria, it is still likely to rank highly for "garden tractors".
If you are looking at Lighthouse, I would point out a few things:
- The Lighthouse audits run on your own local machine will differ from those run on hosted servers like PageSpeed Insights, and those in turn will differ from the "field data" in the Chrome UX Report.
- In the end, it's the field data that will be used for Page Experience validation, according to Google. But lab-based tools are very helpful for getting immediate feedback, rather than waiting 28 days or more for field data to accumulate.
- If your concern is solely the impact on search rankings, then it makes sense to pay attention specifically to the three metrics being considered as part of Core Web Vitals (CLS, FID, LCP).
- But also realize that while you are improving scores for criteria that will be validated as search signals, you're likely improving the user experience too. Taking CLS as an example: users are certainly frustrated when they attempt to click a button and end up clicking something else instead because of a layout shift, and frustrated users generally mean lower conversion rates. So, by focusing on improvements in measures like these (I realize your question about large images doesn't necessarily pertain specifically to CLS), you are optimizing both for search ranking and for conversions.
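To make the CLS point above concrete: CLS is accumulated from individual layout-shift entries, and shifts that happen right after user input are excluded. In the browser you would feed this from a `PerformanceObserver` watching the `layout-shift` entry type; the sketch below is a pure function over plain objects (with hypothetical shift values) so the idea is easy to see:

```javascript
// Sketch of how a cumulative layout shift score is accumulated: sum the
// `value` of every layout-shift entry that was not preceded by recent user
// input (entries with hadRecentInput are excluded, since user-initiated
// shifts are expected and don't count against the page).
function cumulativeLayoutShift(entries) {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((sum, e) => sum + e.value, 0);
}

// Hypothetical entries, e.g. images loading without reserved dimensions:
const shifts = [
  { value: 0.1, hadRecentInput: false },
  { value: 0.05, hadRecentInput: true }, // user-initiated, excluded
  { value: 0.1, hadRecentInput: false },
];
console.log(cumulativeLayoutShift(shifts)); // → 0.2
```

A score of 0.2 would sit above the commonly cited "good" CLS threshold of 0.1, which is the kind of thing reserving width/height on large images helps avoid.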