How can I prevent the new indexing bug from creating useless pages on Google?

@Greg-Bernhardt

Hi Greg,

Please also respond to my other questions:

I think part of the problem is that Shopify has created many URL variants of this sandbox pixel, and they are now scattered across the Google Search Console Page Indexing report. I have versions in:

Blocked by Robots.txt

Crawled - currently not indexed

Excluded by ‘noindex’ tag

Not found (404)

So can you confirm exactly what steps are needed to clean up this mess for each of these four categories?

Reading through your comments, it seems the solution is to do nothing and let the Shopify-created URL fix the issue. If so, you need to go back to all the versions of this sandbox URL that Shopify created and make them non-indexable, i.e. like the wpm@0.0.245@ variant.

https://www.dropbox.com/s/83dlit8nj3h96wi/Screenshot%202023-03-26%2011.08.55.png?dl=0
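For reference, my understanding is that making a URL non-indexable means serving a robots meta tag in the page's head, which is presumably what the wpm@0.0.245@ variant already does (this snippet is illustrative, not the actual Shopify markup):

```html
<!-- Hypothetical example: a robots meta tag telling Google not to index the page.
     This must be served on a crawlable page - if the URL is blocked in robots.txt,
     Google never fetches the page and never sees this tag. -->
<meta name="robots" content="noindex">
```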

From our side:

We should not block the sandbox URLs in robots.txt or add a noindex, nofollow meta tag for them?

We should not disavow the sandbox URLs?
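To clarify my own question above: as I understand it, robots.txt can only disallow crawling; noindex/nofollow are meta tags, not robots.txt directives, and a URL blocked in robots.txt cannot have its noindex tag seen by Google. A quick sketch with Python's standard urllib.robotparser (the Disallow rule and URLs here are illustrative assumptions, not Shopify's actual robots.txt):

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks the sandbox pixel path from crawling.
# Note: "noindex" is NOT a valid robots.txt directive; a Disallow rule only
# prevents crawling, it does not remove already-indexed URLs from Google.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wpm@
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot may not fetch the blocked sandbox URL...
print(rp.can_fetch("Googlebot", "https://example.com/wpm@0.0.245@/sandbox"))  # False
# ...but may still fetch normal store pages.
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
```

So blocking the sandbox URLs in robots.txt would actually prevent Google from re-crawling them and seeing any noindex tag, which is why I'm asking whether we should leave robots.txt alone.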

Thanks

Denis

As an aside, why was this code not tested before it went live to ensure it functioned correctly?