How can I prevent the new indexing bug from creating useless pages on Google?

Topic summary

Shopify merchants report that Google has been aggressively indexing blank [email removed] and later [email removed] pixel sandbox URLs, inflating “Indexed,” “Excluded by noindex,” and 404 counts in Search Console and, for some merchants, coinciding with ranking drops. (WPM = Web Pixels Manager; x-robots-tag is an HTTP response header that applies noindex to a URL without a meta tag in the page.)
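For context, one way to confirm that a URL is noindexed via the header (rather than a meta tag) is to inspect its response headers. A minimal sketch, assuming the headers are available as a dict; the sample values are hypothetical:

```python
def has_noindex_header(headers: dict) -> bool:
    """Return True if an x-robots-tag header contains a noindex directive.

    HTTP header names are case-insensitive, so normalize before comparing.
    """
    value = next((v for k, v in headers.items()
                  if k.lower() == "x-robots-tag"), "")
    return "noindex" in value.lower()


# Offline illustration with hypothetical header values:
print(has_noindex_header({"X-Robots-Tag": "noindex"}))    # True
print(has_noindex_header({"Content-Type": "text/html"}))  # False
```

In practice you would feed this the headers from a HEAD request to the URL in question (e.g. via `urllib.request` or `curl -I`).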

Early community workarounds included robots.txt disallows and Search Console removal requests, but blocking the URLs prevented Google from seeing the noindex directive. Shopify staff advised removing those disallows and recommended against redirects or changing domains.
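The reason blocking backfires can be demonstrated with Python's standard-library robots.txt parser: a compliant crawler checks robots.txt before fetching, so a disallowed URL is never requested and its noindex header is never seen. A sketch using a hypothetical disallow rule like the early workaround:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt resembling the early community workaround.
rules = """\
User-agent: *
Disallow: /wpm
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed path is never fetched by a compliant crawler, so the
# x-robots-tag: noindex response header on it is never observed.
print(parser.can_fetch("Googlebot", "https://example.com/wpm@1.0/sandbox"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products/shirt"))   # True
```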

Shopify’s platform-level changes: web-pixels-manager URLs now return 404; wpm URLs serve noindex via the x-robots-tag header; and a fix prevents crawlers from executing the JavaScript that exposed the pixel URLs. An official FAQ was published; Shopify says merchants generally need to do nothing, and that temporary growth in 404 and noindex counts is expected and not harmful.
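A merchant who wants to spot-check these behaviors could compare each response against the stated expectations. This is a hypothetical helper, not a Shopify API; the path prefixes are assumptions based on the thread:

```python
def matches_expected_behavior(path: str, status: int, headers: dict) -> bool:
    """Check one response against the platform behavior described above.

    - web-pixels-manager URLs should return HTTP 404
    - wpm URLs should serve an x-robots-tag header containing noindex
    Both prefixes are assumptions drawn from the thread, not documented paths.
    """
    robots = next((v for k, v in headers.items()
                   if k.lower() == "x-robots-tag"), "")
    if "/web-pixels-manager" in path:
        return status == 404
    if "/wpm" in path:
        return "noindex" in robots.lower()
    return True  # not a pixel URL; nothing to check


# Hypothetical spot checks:
print(matches_expected_behavior("/web-pixels-manager@1.0/sandbox", 404, {}))
print(matches_expected_behavior("/wpm@1.0/sandbox", 200, {"X-Robots-Tag": "noindex"}))
```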

Recent developments: July spikes in WPM 404 and noindex counts, which Shopify attributes to versioning and cleanup. Some merchants suspect robots.txt gaps and continue to see cluttered GSC reports; they ask for exact timelines and debate 404 vs. 410 (Shopify cites Google’s position that the two are practically equivalent).

Side issues: recommendations and collections/all URLs also appearing in reports; the community suggested targeted robots.txt rules for them. Many screenshots, example URLs, and code snippets are central to the thread. Status: partially resolved; cleanup ongoing; timeframe unspecified.
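For the collections/all and recommendations side issue, the targeted rules the community suggested would look roughly like the fragment below. The exact paths are illustrative assumptions; on Shopify, robots.txt edits go through the robots.txt.liquid template, and disallowing a path carries the same risk noted earlier in the thread of hiding any noindex on it from Google:

```
User-agent: *
Disallow: /collections/all
Disallow: /recommendations/
```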

Summarized with AI on January 24. AI used: gpt-5.

A week after this solution was released, I’m not seeing much improvement in my GSC. This is the latest indexing report: I requested it last week, and Google just updated it today.

Has anyone had a similar outcome?

Screenshot 2023-05-22 at 10.20.31 AM.png