Google Core Web Vitals: a guide to the new key metrics for SEO

Let’s go back to Core Web Vitals, or simply Web Vitals, the new set of performance metrics that highlight the aspects of web page development that affect user experience: page loading, interactivity and visual stability. Google has already announced that Web Vitals will become ranking factors as part of the Page Experience update expected in 2021, so it is more important than ever to understand how they work and how to measure these crucial parameters.

How Web Vitals metrics work

The first aspect to consider is that these metrics focus on the completion of certain events – including what is affected interactively or visually when those events occur – while pages load up to a point of stability relevant to the user experience, as Detlef Johnson explains on Search Engine Land.

This means that the score values can change as users interact with the page and, in general, scores improve when events complete earlier on the stopwatch.

The first three Core Web Vitals metrics

It should also be remembered that Google plans to update the list of these metrics periodically, although only on a roughly annual basis, to avoid changing the goals that SEOs and developers must hit too frequently. The next addition, currently in development, is expected to measure page animations.

For now, however, there are three metrics that Google has defined as vital for the web (each is illustrated in the code sketch after the list), namely:

  1. Largest Contentful Paint (LCP), which measures the interval between the start of page loading and the full rendering of the largest image or text block in the user’s viewport. The score can change during page loading, when content is already visible but the largest node has yet to be painted; this becomes more noticeable on slow connections.
  2. First Input Delay (FID), which is the time a page needs before it is ready for user interaction, i.e. how long it takes the page, while it is being assembled, to respond to a click, scroll or keyboard input by running the corresponding event handlers. User interaction can be greatly delayed by script activity that blocks the main thread.
  3. Cumulative Layout Shift (CLS), which measures the distance moved and the fraction of the viewport that shifts due to DOM manipulation or missing size attributes on major media elements. When we fail to define the dimensions of our hero images, for example, the text on the page appears at first only to be pushed around, but the result is a disruptive shift of the content layout for our users.
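
To make the three definitions concrete, here is a minimal sketch that logs each metric with Google’s open-source web-vitals JavaScript library (discussed again further down); it assumes the library’s v2-era API with getLCP, getFID and getCLS, which later releases rename to onLCP, onFID and onCLS.

```javascript
// Minimal sketch: log the three Core Web Vitals as the browser reports them.
// Assumes the v2-era API of the web-vitals library (getLCP/getFID/getCLS);
// newer releases expose onLCP/onFID/onCLS instead.
import { getLCP, getFID, getCLS } from 'web-vitals';

function logMetric(metric) {
  // metric.value is in milliseconds for LCP and FID, and a unitless
  // layout-shift score for CLS.
  console.log(`${metric.name}: ${metric.value}`);
}

getLCP(logMetric); // time to render the largest image or text block in view
getFID(logMetric); // delay before the first user input can be handled
getCLS(logMetric); // cumulative fraction of the viewport that shifted
```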

The three values of the metrics

The performance measurement for each Web Vitals metric is classified into one of three results:

  • Good (which means a pass)
  • Improvements needed
  • Failed

Longtime users of PageSpeed Insights will be familiar with similar metrics, many of which remain unchanged, though perhaps not all. Core Web Vitals are the culmination of those other metrics: they distill the complexity of the developer experience so that all users (site owners, but also webmasters and developers) can rely on a welcome clarity, with fewer, but broader, metrics to follow.
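
As an illustration of those three bands, the sketch below classifies raw values against the thresholds Google publishes for Core Web Vitals (LCP 2.5 s / 4 s, FID 100 ms / 300 ms, CLS 0.1 / 0.25); the helper is ours, not part of any Google tool.

```javascript
// Illustrative helper (not part of any Google tool) that maps a raw metric
// value to the three result bands, using Google's documented thresholds.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100, poor: 300 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless layout-shift score
};

function classify(name, value) {
  const t = THRESHOLDS[name];
  if (value <= t.good) return 'Good';
  if (value <= t.poor) return 'Improvements needed';
  return 'Failed';
}

console.log(classify('LCP', 1800)); // "Good"
console.log(classify('CLS', 0.31)); // "Failed"
```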

How to analyze Web Vitals scores for desktop and mobile

You can analyze, measure and obtain independent Web Vitals scores for mobile (phone) and desktop/laptop devices.

In some tools you can specify which type of device to test, or switch from one type to the other when both are available. PageSpeed Insights, for example, shows mobile statistics by default, so we have to switch to the desktop tab to see the difference in scores between the two versions.
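
For readers who prefer the API to the web interface, here is a small sketch against the public PageSpeed Insights v5 endpoint, where the strategy parameter switches between the two device categories; example.com is a placeholder and sustained use may require an API key.

```javascript
// Sketch: requesting mobile vs desktop lab scores from the PageSpeed Insights
// v5 API. The `strategy` query parameter selects the device category.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function psiPerformanceScore(url, strategy /* 'mobile' | 'desktop' */) {
  const res = await fetch(`${PSI_ENDPOINT}?url=${encodeURIComponent(url)}&strategy=${strategy}`);
  const data = await res.json();
  // lighthouseResult carries the lab data; loadingExperience carries CrUX
  // field data when the page is covered by the report.
  return data.lighthouseResult.categories.performance.score;
}

psiPerformanceScore('https://example.com/', 'mobile').then((s) => console.log('mobile:', s));
psiPerformanceScore('https://example.com/', 'desktop').then((s) => console.log('desktop:', s));
```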

Google has added the Core Web Vitals metrics to Search Console reports where Chrome User Experience data is available; in that case the tool’s dashboard shows scores for both device categories across indexed URLs, and you can drill down into groups of pages that show problems.

How the Chrome User Experience Report works

As part of its Chrome User Experience Report (CrUX), Google publishes field data from over 18 million websites that have collected enough statistics to report Web Vitals. The data are stored in Google’s BigQuery service, where you can query statistics from these websites going back several years; updates are ongoing, with a new release on the second Tuesday of each month as the data accumulates, the article notes.

To see mobile and desktop scores in the new CrUX report, we need to use “phone” or “desktop” as device form factors in our SQL statements (see the sketch below). Interestingly, “mobile” does not work because it is not a valid column value, and “tablet” only rarely works because of the scarcity of data for that form factor. Tablet data can be seen in queries for Google’s own origin (domain), for example, but you won’t see it for less-trafficked sites.
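
A sketch of such a query, run here through the BigQuery Node.js client: the dataset name (chrome-ux-report.all.202012), the histogram columns and the release month are assumptions based on the public CrUX schema, so adjust them to the table you actually query.

```javascript
// Sketch: LCP histogram for one origin and form factor from the public CrUX
// dataset in BigQuery. Table and column names follow the public schema but
// should be checked against the release you query.
import { BigQuery } from '@google-cloud/bigquery';

async function lcpHistogram(origin, formFactor /* 'phone' or 'desktop' */) {
  const bigquery = new BigQuery();
  const query = `
    SELECT bin.start AS bin_start, SUM(bin.density) AS density
    FROM \`chrome-ux-report.all.202012\`,
         UNNEST(largest_contentful_paint.histogram.bin) AS bin
    WHERE origin = @origin
      AND form_factor.name = @formFactor  -- 'phone' or 'desktop'; 'mobile' is not valid
    GROUP BY bin_start
    ORDER BY bin_start`;
  const [rows] = await bigquery.query({ query, params: { origin, formFactor } });
  return rows;
}

lcpHistogram('https://www.example.com', 'phone').then(console.table);
```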

The difference between lab data and field data

To really understand metrics and scores, however, we must first become familiar with the concepts of lab data and field data, a topic on which Detlef Johnson offers a useful aside.

Web Vitals “lab” data are collected through the browser’s APIs, using timers for page loading events and mathematical approximations that simulate user interactivity. “Field” data, instead, consist of the same metrics collected from real users browsing our pages, with the resulting event timer values transmitted to a repository.

Usage conditions can produce highly variable scores, and the scores themselves can literally change while browsing the pages. That is why we need to understand how each score is tabulated for a given environment, and interpret results only after we have first determined whether we are looking at lab data or field data.

Information on tools

We can access lab data in real time using PageSpeed Insights, WebPageTest, Chrome DevTools and a new Chrome browser extension, “Web Vitals”. PSI and WebPageTest calculate scores from page loading events and approximate page interactivity delays by counting the time of thread-blocking script tasks.

Lab data tools are incredibly useful in a workflow for producing reports and improving scores, and “should be part of your technical SEO arsenal”, the author recommends.

You can introduce the Web Vitals JavaScript library into your workflow and test pipeline; available via CDN, the library can be included in production HTML and written to transmit the collected field data independently to the point where we want to compare it for reports.
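
As a sketch of that pattern, the snippet below includes the library and beacons each metric to a collection endpoint; the /analytics URL is hypothetical, and the getCLS/getFID/getLCP names again assume the v2-era API.

```javascript
// Sketch of the collect-and-transmit pattern: report each field metric to our
// own repository. '/analytics' is a hypothetical endpoint.
import { getCLS, getFID, getLCP } from 'web-vitals';

function sendToRepository(metric) {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon survives page unloads; fall back to fetch with keepalive.
  if (!(navigator.sendBeacon && navigator.sendBeacon('/analytics', body))) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

getCLS(sendToRepository);
getFID(sendToRepository);
getLCP(sendToRepository);
```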

Lighthouse provides several access points that can be useful in the development workflow and includes numerous additional tests that help ensure compliance with modern web standards; it can also help us debug situations in which we are investigating and fixing Web Vitals problems.

The comparison between lab data and field data

Modern browsers, starting with Chrome, measure how users actually experience our website in the wild via a built-in JavaScript API. We can access it with any JavaScript, or choose one of Google’s libraries and adapt it to our needs. Google collects and, as mentioned, publishes field data from Chrome users for its CrUX report, and sometimes uses the same browser APIs.

There are several ways to access or view CrUX data: we can use connectors that pipe BigQuery output into other Google services for building dashboards, such as the default connector for Data Studio. It is easier to access field data when the site is included in CrUX, after verifying the property in Google Search Console, because the dashboard displays field data with an interface that lets you drill down with clicks instead of writing SQL queries. Alternatively, we can simply use the PSI tool, which provides field data going back a maximum of 28 days.
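
Those 28 days of field data also come back through the same PSI v5 API sketched earlier: the loadingExperience block holds the CrUX field metrics next to the lab result, as in the hedged sketch below (the metric keys follow the public API documentation; the block is simply absent when the URL is not covered by CrUX).

```javascript
// Sketch: reading the 28-day CrUX field metrics from a PageSpeed Insights v5
// response, alongside the lab score from the same call.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function fieldVsLab(url) {
  const res = await fetch(`${PSI_ENDPOINT}?url=${encodeURIComponent(url)}`);
  const data = await res.json();
  const field = data.loadingExperience ? data.loadingExperience.metrics : null;
  return {
    labPerformanceScore: data.lighthouseResult.categories.performance.score,
    fieldLCPms: field && field.LARGEST_CONTENTFUL_PAINT_MS.percentile,
    fieldFIDms: field && field.FIRST_INPUT_DELAY_MS.percentile,
    fieldCLS: field && field.CUMULATIVE_LAYOUT_SHIFT_SCORE.percentile,
  };
}

fieldVsLab('https://example.com/').then(console.log);
```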

Troubleshooting Web Vitals reports

Because of the dynamic nature of some of the timings and of the way they are collected, we should always check lab data by correlating it with field data so that we can debug discrepancies. For example, subsequent page loads may produce different values when using the Web Vitals extension, and this can happen for a couple of reasons, Johnson explains.

Our browser can assemble resources faster on a refresh thanks to its cache; in addition, the extension can accumulate interactivity values while you browse the page, which usefully approximates real-world field data instead of computing a score by adding up thread-blocking script task times.

For more accurate local results with the Web Vitals extension and Chrome DevTools, remember to empty the cache or bypass it with a shift-refresh when moving quickly through the browser in your workflow. Another tip is to load “about:blank” before starting a performance recording session in DevTools, for a clean start to the report.

Ideally, lab and field scores should not differ much without a good reason. Whenever significant changes are made, lab results will be ahead of field data. This means that if we see failing field tests alongside passing lab scores, we either have to wait patiently for field data to be collected or send field data to Analytics independently in order to verify it.

How to manage the analysis of the three metrics

You might imagine that the hardest field score to emulate locally is CLS, but according to Johnson that is not necessarily true: we can enable an option in the Chrome extension that applies a Web Vitals overlay to the page and then watch the scores change as we interact with it and navigate.

This technique also works for FID: the score for this metric starts empty; with the first interaction on the page (click, scroll or keyboard input), the thread-blocking time up to that moment is added, and that becomes the score.
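
Under the hood, that first-interaction score comes from the browser’s “first-input” performance entry; the minimal sketch below shows how the value is derived, which is essentially the calculation the web-vitals library performs.

```javascript
// Sketch: deriving FID from the browser's 'first-input' performance entry.
// The delay is the gap between the user's interaction and the moment the
// main thread is free to start running its event handler.
new PerformanceObserver((entryList) => {
  const [entry] = entryList.getEntries();
  if (!entry) return;
  const fid = entry.processingStart - entry.startTime; // milliseconds
  console.log('FID:', fid);
}).observe({ type: 'first-input', buffered: true });
```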

Finally, the highly detailed information in Chrome DevTools allows us to troubleshoot CLS problems in depth with performance recording and playback. Look for the “Experience” section, which replays CLS shifts in the recording; there is also a setting that highlights the shifts on screen with a blue flash wrapping the elements as they move and add to the score.

Which tools to use to measure Core Web Vitals

In closing, the SEO expert lists the tools that are useful for measuring the Core Web Vitals metrics, also offering details on how they work.


  • PageSpeed Insights. This is the first tool to use to measure Web Vitals values. In its report we can obtain both lab data and field data (where available), plus many other metrics largely aimed at improving pages with errors, in particular results that affect a page’s speed and the download of its resources.
  • Web Vitals extension for Chrome. With the Chrome extension you can read the Web Vitals directly when the page loads and, as mentioned, interact with the page to troubleshoot issues with First Input Delay and/or Cumulative Layout Shift. The overlay is also available page by page as you browse websites.
  • WebPageTest. This independent testing tool lets you configure runs under a variety of conditions; built by Google engineers on the Chromium team, its information is authoritative, and it also exposes RESTful APIs.
  • Google Search Console. After verifying ownership of the website, we can dig into problem areas with pages that are failing in the field, provided the site is included in CrUX. You can drill down to locate groups of pages with similar problems, and the tool ultimately links through to PageSpeed Insights.
  • Web Vitals JavaScript API. It lets you use JavaScript to access the metrics directly from the browser and transmit them to a repository of your choice. Alternatively, we can insert the test into the development process and make sure that changes do not adversely affect the scores after pushing to production.
  • Chrome DevTools. Chrome itself provides the ultimate set of tools for uncovering and tracking down issues, using the highly detailed information available in reports and in the page loading log on the Performance tab. The wide range of tools, switches and options is ideal for more demanding optimization work.

How Google measures Web Vitals

Google’s Developer Advocate Martin Splitt also spoke on the subject: from his official Twitter profile he dedicated a dozen tweets (collected on seroundtable) to answering questions and doubts raised by various users, especially about the method Google uses to measure Web Vitals.

Specifically, Splitt explains that Googlebot, like Lighthouse or PageSpeed Insights itself, measures lab data, that is, hypothetical performance data from a virtually ideal environment. These numbers are therefore not representative of what real users of the site and its pages experience. In the Google Search Console Web Vitals report, however, it is possible to see RUM (real user monitoring), field-based metrics that track real users, although limited, because not all URLs may have enough data.

Page Experience will not use lab data

This also means that any low scores in this tool indicate that real people are encountering difficulties with the site’s pages and may be having a poor UX, which is something to act on. In conclusion, the Googler gives us an important piece of news: the Page Experience evaluation will not use lab data to measure the Core Web Vitals (or at least that is not planned for now, he says).
