How, on what devices, how often is site speed actually measured?

Topic summary

Main issue: Clarifying how Shopify’s site speed report is measured—whether it uses real visitor data or controlled tests—and what devices, locations, browsers, and sampling frequency are involved, especially compared to Google Analytics (GA).

Latest update: The original poster found a Shopify help doc stating speed tests run in Shopify’s own test environment, not on real visitors’ devices. Open questions remain on the exact test devices, browsers/OS, geographic location, how often tests run, and how many samples inform the daily score.

Context vs GA: GA’s site speed can be unreliable due to low sample counts (e.g., users blocking GA), but offers detailed breakdowns and correlations. The thread seeks similar transparency from Shopify.

Differing views: Another participant notes Google PageSpeed Insights (PSI) discloses its emulated device and network conditions for lab scores. They argue Shopify’s speed score lacks actionable detail and is only marginally useful for comparing against “similar stores,” not for assessing real user experience.

Status: Unresolved. Key methodological details (devices, frequency, sample size) are still unknown.

Summarized with AI on February 3. AI used: gpt-5.

The site speed reports are nice and helpful. But I’m wondering how accurate they are, given the number of samples, the range of devices, the locations of users, and so on.

How are these speeds actually measured? Are samples taken directly from genuine site visitors? Or does Shopify run automated tests on its own machines? If the latter, what test devices are used? Which browsers, operating systems, etc.? From what geographical location? How many tests are run? And so on.

I had some trouble getting accurate page load speeds from Google Analytics, because the number of samples actually sent was too low; many visitors probably block Google Analytics. So I wonder how Shopify does it and how reliable its speed measurements are compared to, say, Google Analytics.

In Google Analytics one can (in theory, of course) break everything down by many different factors and analyze correlations with other metrics like conversion rate. One can zoom in and look closer… and see that the numbers are inaccurate because the number of samples is so low and unsteady.
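The sampling problem described above can be illustrated with a short simulation (a generic sketch, not GA’s actual pipeline; the load-time distribution is assumed): with only a handful of timing beacons per day, the reported daily average jumps around even though real performance is unchanged.

```python
# Sketch: why a daily average load time is unsteady at low sample counts.
import random
import statistics

random.seed(42)

def daily_average(sample_size: int, days: int = 30) -> list[float]:
    """Average load time (seconds) reported per day from `sample_size` hits."""
    # Assume true load times are log-normally distributed around ~2 s.
    return [
        statistics.fmean(random.lognormvariate(0.7, 0.5) for _ in range(sample_size))
        for _ in range(days)
    ]

small = daily_average(sample_size=5)    # e.g. most visitors block the tracker
large = daily_average(sample_size=500)  # near-complete sampling

# The day-to-day spread shrinks roughly with the square root of sample size,
# so the 5-sample series looks like performance is fluctuating when it isn't.
print(f"stdev with   5 samples/day: {statistics.stdev(small):.3f} s")
print(f"stdev with 500 samples/day: {statistics.stdev(large):.3f} s")
```

The same effect would apply to any lab-based score: the fewer test runs feed into a daily number, the noisier that number is from day to day.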

In Shopify I don’t know how many samples are taken and how they are distributed over time, and have no idea if these measurements even come from real visitors.

Can anyone enlighten me about these questions?

I see that one of my questions is answered here: https://help.shopify.com/en/manual/online-store/store-speed/speed-report#the-shopify-test-environment

The tests don’t run on real visitors’ devices, but in a test environment on Shopify’s own machines.
The questions still remain: what kinds of devices are these, how often do the tests run, and how many runs per day go into calculating the daily score?

I would be interested in knowing the answer too.
At least PageSpeed Insights lets you know the device and network speed it emulates for its lab scores.

Honestly, Shopify’s page speed scores are close to useless: they give no insight and don’t cover anything that isn’t already covered by PSI. The only arguable use is that you can compare your score against the average across “similar” stores, so you know whether your speed is above or below it. Even that is useless in a practical sense, as it doesn’t tell you whether your users are actually having a good experience, which is what matters, or what your PSI score is, if you are obsessed with hitting 100% there.