Why the Chrome User Experience Report changed the page speed measurement game

By Mat Davis | 10 Jan 2019

Since the mobile-focused Speed Update landed in July 2018, the need for accurate speed data has never been greater. With users' attention spans and willingness to wait for a page to load shorter on mobile devices than on desktop, optimising a site for speed is vital to maintaining or improving organic performance and hitting conversion targets. Google's own research underlines this: it found that "when page load time goes from one to three seconds the probability of a bounce increases 32%", illustrating that even for pages deemed 'fast', further optimisation can yield great rewards.

Of the huge number of tools available to test page speed, Google's PageSpeed Insights has long been a staple of the technical SEO toolkit. Updates over the last couple of years have seen it incorporate Lighthouse, another of Google's performance tools, into its data sets to provide more granular detail on speed metrics.

Introducing field data

In late 2018, Google's own speed tool saw a significant upgrade as it began to provide field data in addition to lab data. The traditionally available lab data provides key speed metric scores such as First Contentful Paint (when the first text or image is painted), Time To Interactive (when the page becomes fully interactive), First Meaningful Paint (when the primary content is visible), and Speed Index (how quickly the contents of a page are visibly populated) in controlled and consistent simulated conditions.

The new field data (also known as Real User Monitoring) draws on the Chrome User Experience Report (CrUX), a historical dataset of real-world performance collected from Chrome users visiting the page across a variety of locations and devices. Having this breadth of real-world user experience data readily available is a first for the tool, and it can produce surprising results when compared with lab test data.

Previously, gaining a comprehensive understanding of how a site performs in the real world meant running tests from a variety of tools and server locations with multiple configurations of connection speed and device. This new addition to the tool provides a readily available, realistic snapshot of a site's overall performance, and of its competitors' performance too.
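As a sketch of what this looks like in practice, the field metrics can be pulled programmatically from the PageSpeed Insights v5 API, which returns CrUX data under a `loadingExperience` key alongside the Lighthouse lab results. The helper names and the sample response shape below are illustrative assumptions, not part of any official client library:

```python
import urllib.parse

# Public PageSpeed Insights v5 endpoint (an API key query parameter
# can be added for higher request quotas).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile"):
    """Construct a PSI v5 request URL for the given page and strategy."""
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def summarise_field_data(psi_response):
    """Extract CrUX field metrics from a decoded PSI API response.

    Returns {metric_name: (percentile, category)} for each metric under
    loadingExperience, or an empty dict when Chrome has too little
    real-user data for the URL (common for low-traffic pages).
    """
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    return {
        name: (m.get("percentile"), m.get("category"))
        for name, m in metrics.items()
    }
```

Fetching `build_psi_url("https://example.com")` with any HTTP client and passing the decoded JSON to `summarise_field_data` would then surface entries such as `FIRST_CONTENTFUL_PAINT_MS` with a percentile value and a FAST/AVERAGE/SLOW category, mirroring what the tool's interface shows.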

While lab data remains useful for benchmarking and monitoring the implementation of speed optimisations, this new data set provides a more accurate way of understanding how Google perceives the speed of your site. With user location, connection strength, and device all influencing field data, speed optimisation should now focus on the areas where real-world data shows the greatest need for improvement, rather than chasing incremental gains in lab data metrics.