Why is a URL blocked by robots.txt on Google search console?

Topic summary

A Shopify site owner discovered URLs appearing in Google Search Console that are blocked by robots.txt, including cart addition URLs (e.g., /cart/add?id=...) and collection sorting parameters.

Key Points:

  • URLs contain query parameters for cart operations, access tokens, and sorting options
  • Some URLs appear reversed or garbled in the console output
  • The collection sorting URL (?sort_by=title-ascending) is a duplicate of the default sorting method

Resolution:

  • No action needed—blocked URLs appearing in Search Console is normal behavior
  • Cart pages shouldn’t be crawled by bots anyway
  • The sorting parameter URL is simply a duplicate of the default collection page

Status: Question answered; blocking these URLs via robots.txt is working as intended and requires no fixes.

Summarized with AI on November 13. AI used: claude-sonnet-4-5-20250929.

I don’t understand why such URLs appeared in Google Search Console and were then blocked by robots.txt. Is this a Shopify issue?

E.g.: /cart/add?id=45457019928894&quantity=1
/collections/dirt-petrol-bike?sort_by=title-ascending
/cart/45391562867006:1?access_token=66a155ea5a45623f7af670ab4e90407c
/cart/45401580503358:1?storefront=true&access_token=66a155ea5a45623f7af670ab4e90407c

How do I fix it?

You don’t fix this.

Just because a URL is blocked in a tool doesn’t mean anything by itself.

There’s no reason for a bot to crawl the cart page, as it cannot purchase anything.

And /collections/dirt-petrol-bike?sort_by=title-ascending is just a duplicate of /collections/dirt-petrol-bike if that’s the default sorting method.
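If you want to confirm that a "blocked" report is just robots.txt working as intended, you can evaluate a URL against disallow rules with Python's standard `urllib.robotparser`. The rules below are a simplified sketch, not copied from any real store's file; note that this parser only does plain prefix matching, so the `*` wildcard patterns Shopify's generated robots.txt uses for sort parameters are not demonstrated here:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules modelled on the kind of entries a Shopify
# store's robots.txt contains (simplified: prefix rules only).
rules = """User-agent: *
Disallow: /cart
Disallow: /checkout
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The cart URL is disallowed because its path starts with /cart...
print(rp.can_fetch("*", "https://example.com/cart/add?id=45457019928894&quantity=1"))

# ...while a plain collection page remains crawlable.
print(rp.can_fetch("*", "https://example.com/collections/dirt-petrol-bike"))
```

So a "blocked by robots.txt" line in Search Console for a cart URL just means the rule matched, which is exactly what you want.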
