How can I stop unwanted URLs from being indexed in Google?

Something to note: for the robots.txt file to work properly for Google, according to Shopify support documents, it must directly target the Googlebot crawler and not just crawlers in general.

The reason you're seeing more pages is that Google treats each URL string as a different page. For example, if I have a single page such as website.com and a parameter gets appended, such as "website.com/?pr_prod_strat=asdf", Google will view that as a separate page. It might take some time for Google to pick up the robots.txt change, but hopefully those URLs will fall off the list.

Though these unindexed pages showing in GSC are far from ideal, they may not be completely detrimental (though I'm not certain). Regarding crawl budget, Google mainly cares about sites that have close to or over a million pages to crawl. With that in mind, I don't think a couple hundred would affect your crawl budget, but it may affect SEO in some way. Food for thought.
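If it helps, here's a minimal sketch of the kind of Googlebot-specific rule I mean. The `pr_prod_strat` parameter is just taken from the example above; adjust the pattern to whatever parameters actually show up in your URLs. If you're on Shopify, I believe edits like this go through the robots.txt.liquid template rather than a static file:

```
# Sketch of a Googlebot-targeted rule (parameter name assumed
# from the example above; swap in your real query parameters).
User-agent: Googlebot
# Block URLs where pr_prod_strat is the first query parameter:
Disallow: /*?pr_prod_strat=
# Block URLs where it appears as a later parameter:
Disallow: /*&pr_prod_strat=
```

The `*` wildcard matches any run of characters, so these two lines together should catch the parameter wherever it sits in the query string. It's worth checking a few sample URLs against the rule in Search Console's robots.txt report before relying on it.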