Pages blocked by robots.txt

Topic summary

A user reports 488k pages blocked by robots.txt on their Shopify site (mommyandme.online), believing this causes slow performance.

Key Response Points:

  • Shopify’s robots.txt is auto-generated; the generated file cannot be edited directly, though it can be customized through a robots.txt.liquid theme template
  • Blocked pages typically include cart, checkout, and admin sections, which are intentionally excluded to keep irrelevant content out of search indexes
  • Check Google Search Console > Coverage report to identify whether any important pages are affected

Performance Solution:
Pages blocked by robots.txt do not themselves slow a site down; the slowness likely stems from other factors. Suggested optimizations include:

  • Reviewing installed apps
  • Optimizing image sizes
  • Improving theme performance

Status: Issue remains unresolved; the user needs to confirm whether any of the blocked pages actually matter and address site speed through optimization rather than robots.txt changes.


I have over 488k pages that are blocked by robots.txt.

Can anyone help me solve this?

www.mommyandme.online

It makes my site slow.

Khaled

Hi @kldlaz

Shopify’s robots.txt file is automatically generated, so you can’t edit the generated file directly; if you genuinely need custom rules, Shopify lets you add them through a robots.txt.liquid theme template. In most cases, though, you can manage crawling concerns in a few ways.

Check whether the blocked pages matter – Shopify blocks certain pages (e.g., cart, checkout, admin) by design to prevent search engines from indexing irrelevant content, so a high blocked-page count is not by itself a problem. Review the blocked URLs in Google Search Console > Coverage report to see whether any important pages are affected.
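If you’d rather spot-check specific URLs than wait on Search Console, a short script works too. Below is a minimal sketch using Python’s standard-library robotparser; the domain and sample paths are assumptions based on Shopify’s usual defaults, so substitute the URLs you actually care about.

```python
# Minimal sketch: ask the live robots.txt which URLs a crawler may fetch.
# The domain and paths below are assumptions; replace them with your own.
from urllib.robotparser import RobotFileParser

STORE = "https://www.mommyandme.online"

parser = RobotFileParser()
parser.set_url(f"{STORE}/robots.txt")
parser.read()  # fetch and parse the store's live robots.txt

# Shopify blocks cart/checkout/admin by default; storefront pages like
# collections and products should remain crawlable.
sample_paths = [
    "/cart",
    "/checkout",
    "/admin",
    "/collections/all",
    "/products/example-product",  # hypothetical handle, for illustration
]

for path in sample_paths:
    allowed = parser.can_fetch("Googlebot", f"{STORE}{path}")
    status = "ALLOWED" if allowed else "BLOCKED"
    print(f"{status:7s}  {path}")
```

If storefront pages such as /collections/... and /products/... come back ALLOWED, the large blocked count is almost certainly just cart, checkout, and admin URLs doing what they should.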

Improve site speed – Pages blocked by robots.txt don’t slow the storefront down for visitors, so if performance is the concern, look at your installed apps, image sizes, and theme optimization rather than robots.txt.
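For the speed side, Google PageSpeed Insights (or Lighthouse) gives the full breakdown, but a rough first check of server latency and HTML weight takes only a few lines. This is a sketch under assumptions, not a benchmark: it needs the third-party requests library, and the URLs are examples from this thread.

```python
# Rough sketch: compare server latency and HTML page weight for a few
# storefront pages. Requires `pip install requests`. URLs are examples.
import time
import requests

PAGES = [
    "https://www.mommyandme.online/",
    "https://www.mommyandme.online/collections/all",
]

for url in PAGES:
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    total = time.perf_counter() - start
    latency = resp.elapsed.total_seconds()  # time until response headers arrived
    size_kb = len(resp.content) / 1024      # size of the HTML document only
    print(url)
    print(f"  status={resp.status_code}  latency={latency:.2f}s  "
          f"total={total:.2f}s  html={size_kb:.0f} KB")
```

Note that this only measures the HTML document itself; heavy images and app scripts, which are the usual culprits, show up in the PageSpeed/Lighthouse report rather than here.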