Core Web Vitals and PageSpeed Insights Scores Not Matching
-
We have some URLs that are being flagged as poor in the Core Web Vitals report inside Search Console. For example, the report says that some pages have too many CLS (Cumulative Layout Shift) issues.
While investigating what we can fix, we noticed that when we run the same pages through the PageSpeed Insights tool we are not getting the same bad scores. This makes it hard for us to know what actually needs to be addressed. Nor can we tell whether a change fixed the issue, because PageSpeed Insights never showed the issue in the first place. Has anyone else run into this? If so, have you found a way to resolve it?
-
To my understanding, GSC reports based on "field data" (the aggregate scores of real visitors to a specific page over a trailing 28-day period). When you run PageSpeed Insights, you can see both field data and "lab data"; the lab data is your specific run. There are quite a few reasons why field data and lab data may not match. One is that changes made to the page show up immediately in the lab data, but won't be reflected in the field data until enough of the 28-day window has rolled over. Another is that the lab device doesn't run at the same specs (device, network, CPU throttling) as the real users behind the field data.
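One way to see both numbers side by side is the PageSpeed Insights API, whose response carries the field data under `loadingExperience` and the lab run under `lighthouseResult`. A minimal sketch, assuming those documented JSON paths; the function names and the API-key placeholder are my own:

```python
import json
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def extract_cls(psi_response: dict) -> dict:
    """Pull field (CrUX, 28-day) and lab (this run) CLS out of a PSI v5 response."""
    field = (psi_response.get("loadingExperience", {})
                         .get("metrics", {})
                         .get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}))
    lab = (psi_response.get("lighthouseResult", {})
                       .get("audits", {})
                       .get("cumulative-layout-shift", {}))
    return {
        # Field CLS is reported as an integer percentile (p75 CLS x 100).
        "field_cls_p75": field.get("percentile"),
        "field_category": field.get("category"),  # e.g. FAST / AVERAGE / SLOW
        "lab_cls": lab.get("numericValue"),       # CLS from this specific run
    }

def fetch_psi(url: str, api_key: str) -> dict:
    """Fetch the raw PSI response for a URL (api_key is a placeholder)."""
    query = f"{PSI_ENDPOINT}?url={url}&key={api_key}"
    with urllib.request.urlopen(query) as resp:
        return json.load(resp)

# Illustrative response fragment: field data says poor, the lab run looks clean,
# which is exactly the GSC-vs-PSI mismatch described above.
sample = {
    "loadingExperience": {"metrics": {
        "CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 27, "category": "SLOW"}}},
    "lighthouseResult": {"audits": {
        "cumulative-layout-shift": {"numericValue": 0.02}}},
}
print(extract_cls(sample))
```

Comparing the two values over time makes it clearer whether a fix has landed in the lab run but simply hasn't aged into the 28-day field window yet.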
The way I look at it: I use the lab data (and I screenshot my results over time, or use other Lighthouse-based tools like GTmetrix with an account) to assess incremental changes. But the goal is to eventually get the field data, which represents your actual visitors, improved, especially since the field data appears to be what will feed the ranking signals, as best I can tell.
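For judging when the field data should flip the Search Console status, Google's published CLS thresholds are the reference point: a p75 of 0.1 or below is "good", up to 0.25 is "needs improvement", and above that is "poor". A tiny sketch of that bucketing (the function name is my own):

```python
def classify_cls(p75: float) -> str:
    """Bucket a p75 CLS value using Google's published Core Web Vitals thresholds."""
    if p75 <= 0.1:
        return "good"
    if p75 <= 0.25:
        return "needs improvement"
    return "poor"

# A page only clears the Search Console CLS flag once the 75th-percentile
# field value drops into the "good" bucket, regardless of how a lab run scores.
print(classify_cls(0.27))  # the field value from the mismatch scenario above
```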