In a year in which we have all been invited to give more weight to the technical factors that affect user experience on a site, Google could not neglect the tool that for years has been the reference point for measuring performance and optimizing pages. In fact, the new version of PageSpeed Insights is ready: it overcomes the limits users have encountered so far and is positioned as a perfect ally for meeting the challenges set by the Page Experience update.
What is PageSpeed Insights
Google PageSpeed Insights (PSI) is a tool that provides reports on the performance of a page on mobile and desktop devices, along with tips on how to improve that page; among other things, it measures the loading time of a URL, calculates a performance score and analyzes the website for potential improvements.
Over the years, PSI has evolved into a unique source for field and laboratory data, integrating Chrome UX Report (CrUX) information and Lighthouse diagnostics to provide data that help improve website performance.
However, in recent times the problems with this tool have become increasingly evident, particularly in how the data is presented, because there was no clear separation between laboratory and field data; the tool was also built on code that is now ten years old and was therefore due for a redesign.
As a result, users who did not know PageSpeed Insights well enough had difficulty understanding the context of the data, and were therefore in no position to act on it, partly because of the confusion created by the tool's design.
What this tool is for and what kind of info it shows
As things stand today, the PSI report includes performance data for mobile and desktop devices in individual tabs and suggests how to improve a page.
The key components of the report in each case are similar and provide information on:
- Performance Score
The Performance Score is displayed at the top of the PSI report and summarizes the overall performance of the page. This score is determined by running Lighthouse to collect and analyze lab data about the page. A score of 90 or higher is considered good, 50-89 needs improvement, and below 50 is poor.
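The banding above is easy to reproduce programmatically. The sketch below is a hypothetical helper (not part of any PSI API) that maps a Lighthouse performance score to the label PSI displays, assuming the documented cutoffs of 90 and 50:

```python
def score_band(score: int) -> str:
    """Classify a PSI/Lighthouse performance score (0-100) into the PSI band."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 90:
        return "Good"              # 90-100: good
    if score >= 50:
        return "Needs Improvement" # 50-89: needs improvement
    return "Poor"                  # 0-49: poor

print(score_band(92))  # Good
print(score_band(67))  # Needs Improvement
print(score_band(41))  # Poor
```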
- Field data
Field data from the CrUX dataset provides insights into users' real-world experience. It includes metrics such as First Contentful Paint (FCP) and measures the Core Web Vitals; alongside those values, you can also see the distribution of page loads in which the value of a particular metric was Good, Needs Improvement or Poor, indicated by green, amber and red bars respectively.
The distribution and scores are based on page loads for users in the CrUX dataset. Scores are calculated over the last 28 days and are not available for new pages, where sufficient real-user data may not yet exist.
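To make the field-data evaluation concrete, the sketch below computes a 75th percentile over a set of made-up sample values and classifies it using the publicly documented Core Web Vitals thresholds; the nearest-rank percentile method and the sample numbers are illustrative assumptions, not CrUX's exact implementation:

```python
import math

# Documented Core Web Vitals (Good / Poor) boundaries; values between
# the two are "Needs Improvement".
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "FID": (100, 300),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def bucket(metric: str, value: float) -> str:
    """Classify a metric value into Good / Needs Improvement / Poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

def p75(samples):
    """75th percentile using the nearest-rank method (an assumption here)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered)) - 1
    return ordered[rank]

# Invented LCP samples, in milliseconds:
lcp_samples = [1800, 2100, 2600, 3900, 2200, 1500, 2400, 2900]
value = p75(lcp_samples)
print(value, bucket("LCP", value))  # 2600 Needs Improvement
```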
- Origin Summary
Users can tick the Show Origin Summary checkbox to view the aggregate score of the metrics for all pages served from the same origin over the last 28 days.
- Lab data
The laboratory performance score, calculated using Lighthouse, helps debug performance problems, as it is collected in a controlled environment.
The report shows performance using metrics such as First Contentful Paint, Largest Contentful Paint, Speed Index, Cumulative Layout Shift, Time to Interactive and Total Blocking Time: each metric is rated and labeled with an icon indicating Good, Needs Improvement or Poor. This section gives a good indication of pre-release performance bottlenecks and can help diagnose problems, but may not detect real-world issues.
- Audit
This section lists all the audits run by Lighthouse, showing passed audits along with opportunities for improvement and additional diagnostic information.
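For anyone consuming the report programmatically, the PageSpeed Insights API (v5) exposes the same Lighthouse audits as JSON. The sketch below separates passed audits from improvement opportunities using a hand-made response fragment; the field names (`lighthouseResult`, `audits`, `score`, `title`) follow the public API, the audit names and scores are invented, and the 0.9 cutoff is Lighthouse's conventional pass threshold:

```python
import json

# Trimmed, hand-made fragment shaped like a PSI v5 API response; real
# responses come from the runPagespeed endpoint of the PSI API.
sample = json.loads("""
{
  "lighthouseResult": {
    "audits": {
      "uses-text-compression": {"title": "Enable text compression", "score": 1},
      "render-blocking-resources": {"title": "Eliminate render-blocking resources", "score": 0.4},
      "unused-css-rules": {"title": "Reduce unused CSS", "score": 0}
    }
  }
}
""")

passed, opportunities = [], []
for audit in sample["lighthouseResult"]["audits"].values():
    # Lighthouse scores audits from 0 to 1; 0.9 and above counts as passed.
    (passed if audit["score"] >= 0.9 else opportunities).append(audit["title"])

print(passed)         # audits the page already satisfies
print(opportunities)  # candidates for improvement
```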
What changes with the new PageSpeed Insights version
With the update, which is expected to launch by the end of the year, Google hopes to make the report easier for developers to interpret, so they can act quickly on the information presented.
As the article published on web.dev says, the new version aims to meet 3 main objectives:
- Make the user interface more intuitive, clearly distinguishing between data derived from a synthetic environment and data collected by users in the field.
- Clearly communicate how the Core Web Vitals evaluation is calculated in the user interface.
- Modernize the look and feel of PSI, using Material Design.
Update interventions to PSI
The redesign of the PSI user interface aims to improve the presentation of the report data and to add clarity and granularity to the data available in the report. Above all, the new interface is meant to be more intuitive and to help developers quickly find detailed information about lab and field performance for their pages.
- Clear separation of field and lab data
The tool now clearly separates field data from lab data: the previous “Field data” and “Lab data” labels have been replaced with text explaining what the data means and how it can help. In addition, the traditional lab-based performance score, which used to be shown at the top, has been moved to the Lab Data section to avoid ambiguity about where the score comes from.
- Core Web Vitals evaluation
The result of the Core Web Vitals assessment, which previously appeared in Field Data as the single word “passed” or “failed”, now stands out as a separate subsection with a distinct icon; this does not change how the Core Web Vitals assessment itself is calculated.
From a practical point of view, the FID, LCP and CLS Core Web Vitals metrics can be aggregated at the page or origin level; for aggregations with sufficient data on all three metrics, the aggregation passes the Core Web Vitals assessment if the 75th percentile of all three metrics is Good. Otherwise, the aggregation fails the assessment.
If the aggregation has insufficient FID data, it passes the assessment if the 75th percentiles of both LCP and CLS are Good. Conversely, if LCP or CLS has insufficient data, the page- or origin-level aggregation cannot be assessed.
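The decision rule described above can be sketched as a small function. This is an illustrative reading of the rule, not Google's code: each argument is the 75th-percentile bucket of a metric (“Good”, “Needs Improvement” or “Poor”), or `None` when CrUX lacks sufficient data:

```python
from typing import Optional

def cwv_assessment(lcp: Optional[str], fid: Optional[str], cls: Optional[str]) -> str:
    """Apply the Core Web Vitals aggregation rule to 75th-percentile buckets."""
    if lcp is None or cls is None:
        return "Not assessed"  # insufficient LCP or CLS data: no evaluation possible
    if fid is None:
        # Insufficient FID data: the assessment rests on LCP and CLS alone.
        return "Passed" if lcp == "Good" and cls == "Good" else "Failed"
    # All three metrics present: every 75th percentile must be Good.
    return "Passed" if lcp == fid == cls == "Good" else "Failed"

print(cwv_assessment("Good", "Good", "Good"))               # Passed
print(cwv_assessment("Good", None, "Good"))                 # Passed
print(cwv_assessment("Good", "Needs Improvement", "Good"))  # Failed
print(cwv_assessment(None, "Good", "Good"))                 # Not assessed
```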
- Labels for mobile and desktop performance
On the design side, Google changed the navigation menu at the top and placed the mobile and desktop shortcuts centrally on the report page. The links are now easily visible and clearly indicate which platform the data refers to, and this change has also made the navigation bar cleaner.
- Origin Summary
The Origin Summary provides the aggregate CrUX score for all pages from the origin: it is currently displayed by clicking a checkbox, but in the new version of PSI this section of the report moves to a new “Origin” tab within the Field Data section.