Core Web Vitals and PageSpeed Insights Not Matching Scores
-
We have some URLs that are being flagged as Poor in the Core Web Vitals report inside Search Console. For example, the report says that some pages have too many CLS (Cumulative Layout Shift) issues.
While looking into what we could update, we noticed that when we run the same pages through the PageSpeed Insights tool, we do not get the same bad scores. This makes it hard to know what actually needs to be addressed, and we cannot tell whether a change fixed the issue, because PageSpeed Insights shows no issue in the first place. Has anyone else had similar problems? If so, have you found a way to resolve them?
-
To my understanding, GSC reports based on "field data" (the aggregate scores of real visitors to a specific page over a trailing 28-day period). When you run PageSpeed Insights, you can see both field data and "lab data"; the lab data is your specific run. There are quite a few reasons why field data and lab data may not match. One is that changes made to the page show up in the lab data immediately but won't be reflected in the field data until the next period's dataset is available. Another is that the lab device doesn't run at the same specs as the real users' devices behind the field data.
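If it helps to see the two datasets side by side, here is a minimal sketch (TypeScript) using the public PageSpeed Insights v5 API, which returns both in one response. The exact metric paths below are my assumption from the API's current JSON shape, so verify them against a real response:

```typescript
// Minimal sketch: compare field (CrUX) and lab (Lighthouse) CLS for one URL
// via the public PageSpeed Insights v5 API. loadingExperience = field data,
// lighthouseResult = lab data; the metric paths are assumptions to verify.
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function compareCls(pageUrl: string): Promise<void> {
  const apiUrl = `${PSI_ENDPOINT}?url=${encodeURIComponent(pageUrl)}&category=performance`;
  const response = await fetch(apiUrl);
  const data = await response.json();

  // Field data: 28-day aggregate of real Chrome users (what Search Console reports on).
  const fieldCls = data.loadingExperience?.metrics?.CUMULATIVE_LAYOUT_SHIFT_SCORE;
  // Lab data: this one synthetic Lighthouse run.
  const labCls = data.lighthouseResult?.audits?.["cumulative-layout-shift"];

  console.log("Field CLS (75th percentile, x100):", fieldCls?.percentile, fieldCls?.category);
  console.log("Lab CLS (this run):", labCls?.numericValue);
}

compareCls("https://www.example.com/").catch(console.error);
```

Running this against a page flagged in Search Console makes the mismatch concrete: the field figure is what the Core Web Vitals report grades, while the lab figure is what a single PageSpeed Insights run shows.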
The way I look at it: I use the lab data (screenshotting my results over time, or using other Lighthouse-based tools like GTmetrix with an account) to assess incremental changes. But the goal is to eventually improve the field data, which represents your actual visitors, especially since that appears to be what feeds the ranking signals, as best I can tell.
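And for tracking down what is actually shifting for real users, a small browser-side sketch using the standard "layout-shift" PerformanceObserver can log the offending elements. The loose typing (as any) is only because LayoutShift entries aren't in every default TypeScript lib:

```typescript
// Minimal sketch: log layout shifts on the live page to see which elements
// move. Run in the browser console or a small script on the page itself.
let clsTotal = 0;

new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as any[]) {
    // Shifts right after user input don't count toward CLS, so skip them here too.
    if (entry.hadRecentInput) continue;
    clsTotal += entry.value;
    for (const source of entry.sources ?? []) {
      console.log("Shifted element:", source.node, "CLS so far:", clsTotal.toFixed(4));
    }
  }
}).observe({ type: "layout-shift", buffered: true });
```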
Related Questions
-
A particular page cannot be indexed by Google
On-Page Optimization | Viktoriia1805
Hello, smart people! We need help solving a Google indexing problem. All pages of our website are crawled and indexed, and all of them, including the one mentioned, meet Google's requirements and can be indexed. However, this one page is still not indexed.
Robots.txt is not blocking it.
We do not have a "nofollow" tag.
The page is in the sitemap file.
There are internal links to this page from indexed pages.
We have requested indexing many times, and it is still grey.
The page was created a year ago.
We are open to any suggestions or guidance you may have. What else can we do to expedite the indexing process?
-
Is page speed important for improving SEO ranking?
On-Page Optimization | jasparcj
I saw on an SEO agency's site (https://burstdgtl.com/search-engine-optimization/) that page speed apparently affects Google ranking. Is this true? And if it is, how do I improve it? Do I need an agency?
-
Page Title Length
On-Page Optimization | SunnyMay
Hi gurus, I understand that it is good practice to keep a page title to 50-60 characters. Google appends my brand name (15 characters including spaces) to the end of each title it indexes. Do I need to count what Google adds as part of the maximum recommended length? That is, is the maximum 50-60 characters plus the 15-character brand name Google adds to the end of the title, or 50-60 including the addition? Many thanks!
Lev
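For what it's worth, the safer reading is to budget for the suffix, since Google displays it as part of the title. A throwaway sketch to check a title against that combined budget (the 60-character limit and the 15-character suffix are just the figures from the question, not fixed rules):

```typescript
// Minimal sketch: does a title still fit once the brand suffix is appended?
const BUDGET = 60; // upper end of the 50-60 character guideline from the question

function fitsTitleBudget(title: string, brandSuffix: string): boolean {
  const displayed = `${title}${brandSuffix}`;
  console.log(`"${displayed}" -> ${displayed.length}/${BUDGET} characters`);
  return displayed.length <= BUDGET;
}

// Hypothetical title and brand; the suffix is 15 characters including spaces.
fitsTitleBudget("Core Web Vitals vs PageSpeed Insights", " | ExampleBrand"); // 52/60, fits
```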
-
Why are http and https pages showing different domain/page authorities?
On-Page Optimization | Aquatell
My website www.aquatell.com was recently moved to the Shopify platform. We chose to stay on the http domain because we didn't want to change too much, too quickly by moving to https; only our shopping cart uses the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of each page to the http version. What has me puzzled is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks!
-
Each page with a different meta description?
On-Page Optimization | RonnieT
Each page on my website represents a different department. Can I program the header to show a different meta description on each page, or should there be only one meta description tag per domain?
-
How do I get other pages to show in SERPs?
On-Page Optimization | ribandhull
Why is it that when you google a domain like yahoo.com, you sometimes get a main result and six sub-results (sitelinks) below it? This concerns the first position.
-
Category Pages with Sub-Categories
On-Page Optimization | drewschmaltz
The attached image (cat-subcat.png) will explain it all: each category page starts on the subject of its first sub-category page. This happens twice (well, actually three times, since this section of the site is called Showroom and it starts on the Mowers tab). Is this a terrible approach? If so, how could a site like this be organized better, navigation-wise?
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for: Course (starter, main, salad, etc)
On-Page Optimization | | smaavie
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour) Here are some examples of how URLs may look when searching for a recipe: find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30 There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've seached online and found several possible solutions for this, such as: Setting canonical tag Adding these URL variables to Google Webmasters to tell Google to ignore them Change the Title tag in the head dynamically based on what URL variables are present However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two seperate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google webmasters won't fix the problem in other search engines, and will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards5
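One common pattern for setups like this (offered as a sketch, not the definitive answer) is to keep the filter parameters in the canonical URL but strip pagination, so every page of a given search canonicalizes to page one of that same search. A minimal TypeScript illustration, with parameter names taken verbatim from the question's example URLs:

```typescript
// Minimal sketch: derive a canonical URL for a parameterized search page.
// Filters stay in the canonical; the pagination "start" parameter is dropped.
// Parameter names (including the "preperation-time" spelling) are copied
// from the question's URLs; adapt to the real site.
const FILTER_PARAMS = ["course", "cooking-method", "preperation-time"];

function canonicalFor(rawUrl: string): string {
  const url = new URL(rawUrl);
  const canonical = new URL(url.origin + url.pathname);
  // Re-add recognized filters in a fixed order, so ?a=1&b=2 and ?b=2&a=1
  // both map to a single canonical URL.
  for (const name of FILTER_PARAMS) {
    const value = url.searchParams.get(name);
    if (value !== null) canonical.searchParams.set(name, value);
  }
  // "start" (pagination) is deliberately omitted.
  return canonical.toString();
}

// Example: a paginated variant collapses to page 1 of the same search.
console.log(canonicalFor("https://example.com/find-a-recipe.php?start=30&course=salad"));
// -> https://example.com/find-a-recipe.php?course=salad
```

The trade-off is the one the question already identifies: the paginated results aren't literally identical pages, so this treats "same search, different page" as one canonical document, which is usually acceptable for thin, parameter-generated listings.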