How can I fix the 'blocked by robots.txt' issue on Google Search Console?

Hi everyone,

Hope someone can help.

My site is brand new; I have just started using Google Search Console and getting the site indexed, etc.

I had an email from Google Search Console today telling me that some pages are blocked by robots.txt and couldn't be indexed.

The 3 blocked pages are below, along with the relevant robots.txt rules. Thanks for your help in advance.

https://feelwelluk.co.uk/policies/shipping-policy

Disallow: /preview_script_id
Disallow: /policies/
Disallow: //?ls=&ls=*
Disallow: //?ls%3D%3Fls%3D*

https://feelwelluk.co.uk/account

Disallow: /78927298884/orders
Disallow: /carts
Disallow: /account
Disallow: /collections/sort_by
Disallow: /*/collections/sort_by

https://feelwelluk.co.uk/cart

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
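If you want to confirm locally why these URLs are blocked, you can replay the relevant Disallow rules through Python's standard robots.txt parser. This is a sketch, not your store's full robots.txt: the rule list below is a hypothetical subset pasted from the excerpts above.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical subset of the rules quoted above (not the full file).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /policies/
Disallow: /account
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each of the three reported pages matches a Disallow rule,
# so can_fetch() returns False for a Googlebot user agent.
for url in (
    "https://feelwelluk.co.uk/policies/shipping-policy",
    "https://feelwelluk.co.uk/account",
    "https://feelwelluk.co.uk/cart",
):
    print(url, parser.can_fetch("Googlebot", url))
```

All three print False, which matches what Search Console reported: the crawler is simply obeying the rules.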

Regards

Ryan

Those feel like reasonable things to not have indexed. Are you thinking you really want them indexed for some reason?

I don't need them indexed; I just didn't want it affecting my ranking. If that isn't the case, then perfect.