Hi all - I see that Shopify has now integrated Google PageSpeed Insights into their analytics offering. I'm trying to gauge how seriously I should take this. My site is quite slow at the moment (although I know I need to compress my images more), but I have seen a ton of other popular sites in the 20-30 range. Shopify itself scores only 60. Plus, it seems like a lot of the issues impacting speed come from Shopify itself, and are therefore unfixable (at least for the layperson).
So is this score of any value at all? When I run my site through other testers such as GTmetrix I do fine, mainly As and Bs. Is PageSpeed Insights just really strict?
First, some clarification on PageSpeed Insights vs. GTmetrix.
GTmetrix bases its PageSpeed tab on an old version of PageSpeed Insights. It's important to know that GTmetrix calculates its score from the improvements that can be made, and weights every metric evenly.
The actual PageSpeed Insights builds your score from user experience metrics: the time it takes to load the top of the page, the time it takes to render the largest element on the page, and the time it takes to finish loading every file on the page.
Now, PageSpeed Insights has both lab data and live data. Lab data is a simulated estimate of how the average user will experience the page load; live data comes from actual users of the site and reflects how they experienced it.
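To make the "live data" idea concrete: field measurements are aggregated across many real visits, and Google's Core Web Vitals assessments summarize them at the 75th percentile, so a handful of slow outliers doesn't dominate. Here's a minimal sketch; the sample timings and the `percentile` helper are made up for illustration, not from any real dataset or API.

```python
import math

# Hypothetical page-load times (ms) observed from real users of a site.
samples = [900, 1200, 1500, 2100, 2400, 3200, 4800, 8000]

def percentile(values, pct):
    """Nearest-rank percentile of a list of measurements."""
    ordered = sorted(values)
    # nearest-rank method: take the ceil(pct/100 * n)-th value (1-indexed)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# 75% of sampled visits loaded at or under this value.
p75 = percentile(samples, 75)
```

Note how the single very slow visit (8000 ms) barely moves the p75 summary, which is exactly why field data can look fine even when a lab run on a throttled connection looks bad.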
It's all about your audience.
If your site has bad live data, then I'd be worried, as you might be losing customers. Otherwise, I wouldn't worry about it.
Thanks for the feedback. Sounds like that distinction between lab and live results is important.
I do find it a little crazy though when huge sites like Amazon (scored 56) or, a little smaller, MyTheresa (scored 12!) or Farfetch (12!), seem perfectly usable even though Google scores them so badly. These sites have the IT resources to get a good score if it were worth it. It makes me wonder what use PageSpeed Insights is at all, at least for user experience, and therefore why Shopify is promoting this metric to its shops.
It's just a tool to help you better understand how users experience your site.
However, since Google revealed that page speed affects page ranking (without ever revealing exactly how, of course), people started worrying about their score.
The truth is, if your audience is in a developed nation, then you don't need to worry too much about your PageSpeed score.
The reason these big sites don't worry about their score too much is that they're not too worried about developing nations where internet is slower; otherwise they'd worry a lot more about their score. Internet speed and pricing are not created equal, and there are plenty of countries where the average citizen would spend 10 seconds loading YouTube.
I think the takeaway is not to obsess over a score, but to analyze your audience and how they experience your site.
Hey folks. I work at Google, closely with PageSpeed Insights and related teams. A few comments and clarifications on the discussion above.
Google Search recently published information on the upcoming page experience signals, laying out which metrics site owners should focus on. In particular, I want to call out two aspects: Core Web Vitals, and real user experience measurement. The former consists of Largest Contentful Paint (LCP; measures loading performance), First Input Delay (FID; measures responsiveness), and Cumulative Layout Shift (CLS; measures visual stability). The latter speaks to assessing performance of the page based on how real-world users experienced it.
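For readers wondering how those three metrics turn into a verdict: Google publishes "good" / "needs improvement" / "poor" thresholds for each Core Web Vital (LCP at 2.5s/4s, FID at 100ms/300ms, CLS at 0.1/0.25). The sketch below is my own illustrative helper, not Google code; only the threshold values come from the published guidance.

```python
# Published Core Web Vitals thresholds: (good at-or-below, poor above).
# LCP and FID are in milliseconds; CLS is a unitless score.
THRESHOLDS = {
    "LCP": (2500, 4000),   # Largest Contentful Paint
    "FID": (100, 300),     # First Input Delay
    "CLS": (0.1, 0.25),    # Cumulative Layout Shift
}

def rate(metric: str, value: float) -> str:
    """Classify a single Core Web Vitals measurement."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

So, for example, a page whose largest element renders in 2.1 seconds rates "good" on LCP, while one taking 5 seconds rates "poor".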
PageSpeed Insights provides a score, which is derived from a synthetic test and should be read as a high-level signal of how well the page aligns with recommended best practices. It also provides CrUX data, which reflects how real-world Chrome users experienced the page and site; the Search page experience signal is based on this. This is an important distinction to keep in mind: synthetic tests have many baked-in assumptions and limitations, which means that the score from an individual run is not a guarantee of performance out in the wild. You need to monitor both; see the related documentation here.
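Both signals come back in a single PageSpeed Insights API (v5) response: the synthetic result under `lighthouseResult` and the CrUX field data under `loadingExperience`. Here's a rough sketch of pulling the two apart; the response below is abbreviated and hand-written for illustration, so treat the exact shape as an assumption and check the API reference before relying on it.

```python
# Abbreviated, hand-made stand-in for a PSI API v5 response.
sample_response = {
    "lighthouseResult": {
        # Synthetic (lab) performance score, on a 0-1 scale.
        "categories": {"performance": {"score": 0.56}}
    },
    "loadingExperience": {
        # CrUX verdict aggregated from real Chrome users.
        "overall_category": "FAST",
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {
                "percentile": 1800, "category": "FAST"
            }
        },
    },
}

def summarize(resp: dict) -> tuple:
    """Return (lab score out of 100, real-user verdict)."""
    lab = round(resp["lighthouseResult"]["categories"]["performance"]["score"] * 100)
    field = resp["loadingExperience"]["overall_category"]
    return lab, field
```

In this made-up example the page scores 56 in the lab while real users rate as FAST, which is precisely the lab-vs-field gap discussed above.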
Amazon is a good example to consider: the PSI score may not be as high as you might expect, but the real-world experience observed by users is actually really good and far above most other sites. Which is to say, Amazon is definitely not ignoring performance; it's just that the assumptions in the synthetic test executed by PSI may not align well with Amazon(.com's) user population. Also, I can say with high confidence, because I've worked with teams across many of these largest ecommerce sites: they care a great deal about regions of the world with slower connection speeds, data caps, etc., and you should as well if you have customers in those regions — speed is an even more important and sensitive issue for users there.
Here's a snapshot of a quick benchmark report, which illustrates the point:
You can generate your own, with a custom list of sites, if you install the Core Web Vitals app... or use PageSpeed Insights!