robots.txt modification - remove multiple Disallow rules

I’m a very rusty coder (software engineering degree 20 years ago), so I can hack at code but I’m rough. What is the structure of the for/unless loop I need in robots.txt.liquid to let the Google spider crawl the pages matched by the rules below?

FYI, I’ve got a load of collection/filter(+filter) pages showing up in Google search results that I don’t want there, flagged “Indexed, though blocked by robots.txt” in Search Console. I’ve modified the code of those page templates to add a noindex tag (and point Google at the canonical URL), but I need to update robots.txt so the spider can actually visit those pages and see that.
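For context, the template change I made is along these lines (a sketch of my edit, not verbatim; I’m assuming filtered collection pages show up via current_tags, which may not match how your theme detects filters):

```liquid
{% comment %} In the theme's <head> (theme.liquid): noindex filtered
    collection pages and point crawlers at the canonical, unfiltered
    URL. canonical_url is Shopify's built-in Liquid object. {% endcomment %}
{%- if template contains 'collection' and current_tags -%}
  <meta name="robots" content="noindex">
  <link rel="canonical" href="{{ canonical_url }}">
{%- endif -%}
```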

I figured “else unless” would work like “else if”, but it isn’t behaving as expected; probably I’m missing something basic on the logic front (as mentioned, I’m very rusty!).

How would you code robots.txt.liquid so that Shopify stops emitting the following default rules, allowing the Google spider to visit those URLs?

User-agent: *
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
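For reference, here is my current attempt, adapted from the single-rule unless example in Shopify's robots.txt.liquid docs. The skip_values list is my own idea for avoiding a chain of and/or conditions (which I gather Liquid evaluates right-to-left, without parentheses), so I'm not sure it's the idiomatic way:

```liquid
{% comment %} robots.txt.liquid - emit Shopify's default rules, but skip
    the three Disallow lines for /blogs/ filter URLs so crawlers can
    reach those pages and see the noindex tag. Sketch only. {% endcomment %}
{%- assign skip_values = '/blogs/+,/blogs/%2B,/blogs/%2b' | split: ',' -%}

{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and skip_values contains rule.value %}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Does that look right, or is there a cleaner way to exclude multiple rules?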

Any help much appreciated!