How can I prevent the new indexing bug from creating useless pages on Google?

@Denny The previous code also loaded the iframe via JavaScript. The difference now is that they moved that JavaScript into its own .js file and blocked that file in robots.txt.

The robots.txt block will prevent Google from rendering the code that generates the iframe, and since the blocked resource is a JS file rather than a page, Page Indexing in Search Console should stay clean. I would, however, expect an increase in 404 errors, since the [email removed] URLs now return 404.
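
For reference, a block like that would look something like this in robots.txt (the file path here is just an illustration, not the site's actual path):

```
# Block crawlers from fetching the script that injects the iframe.
# /assets/iframe-loader.js is a hypothetical path for this example.
User-agent: *
Disallow: /assets/iframe-loader.js
```

Keep in mind this only stops Googlebot from fetching and rendering the script; it doesn't remove URLs that were already indexed, so any cleanup of existing junk pages still has to happen via 404s or noindex.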