Unmodified robots.txt stopped blocking pages for 10 days?

The URLs being blocked are the proper ones; there's no issue with what Shopify is blocking now or with what it blocked in the past. The problem is that, for ten days, nothing was being blocked at all, at least according to Google Search Console.

Elsewhere, Shopify has told me they had nothing to do with robots.txt failing, even though it's a file they fully control. Since posting, two other store owners have shown me the same drop in robots.txt blocking over those same dates.
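For anyone wanting to verify this independently of Search Console, here's a minimal sketch using Python's standard-library `urllib.robotparser` to check whether a given URL is blocked by a robots.txt file. The rules and domain below are illustrative samples, not Shopify's actual file; swap in a live fetch of your store's `/robots.txt` to test the real thing.

```python
from urllib.robotparser import RobotFileParser

# Simplified sample rules (hypothetical -- not the full Shopify robots.txt).
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /checkout
Disallow: /cart
"""

parser = RobotFileParser()
# parse() accepts the file's contents as a list of lines.
parser.parse(SAMPLE_ROBOTS.splitlines())

# Googlebot should be blocked from /checkout but allowed on a product page.
print(parser.can_fetch("Googlebot", "https://example-store.com/checkout"))          # False
print(parser.can_fetch("Googlebot", "https://example-store.com/products/widget"))   # True
```

If the crawl records during the ten-day window show disallowed paths returning `True` here against the robots.txt Shopify was actually serving at the time, that would point to the file itself, rather than to Google's interpretation of it.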