I have over 488k pages that are blocked by robots.txt.
Can anyone help me solve this?
It makes my site slow.
khaled
A user reports 488k pages blocked by robots.txt on their Shopify site (mommyandme.online), believing this causes slow performance.
Key Response Points:
Shopify's robots.txt is generated automatically, and the blocked pages (cart, checkout, admin, etc.) are mostly blocked on purpose; the Coverage report in Google Search Console shows whether any important pages are affected.
Performance Solution:
The slowness likely stems from other factors rather than robots.txt blocking. Suggested optimizations include reviewing installed apps, reducing image sizes, and optimizing the theme.
Status: Issue remains unresolved; the user needs to investigate whether the blocked pages are actually problematic and address site speed through optimization rather than robots.txt modifications.
Hi @kldlaz
Shopify’s robots.txt file is generated automatically, so it can’t be edited directly in the admin (the default rules can only be customized through the robots.txt.liquid theme template), but you can manage crawling issues in a few ways.
Check if the blocked pages are necessary – Shopify blocks certain pages (e.g., cart, checkout, admin) to prevent irrelevant content from being indexed. You can review blocked URLs in Google Search Console > Coverage report to see whether any important pages are affected (a quick way to spot-check individual URLs is sketched after these points).
Improve site speed – If slow performance is the real concern, look at your installed apps, image sizes, and theme optimization rather than focusing only on robots.txt.
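To make that Search Console check concrete, here is a minimal sketch using only Python's standard library. It is not a Shopify feature; the domain is the store mentioned in this thread and the sample paths are hypothetical placeholders, so substitute the URLs you actually care about:

```python
# Minimal sketch: check whether a few URLs are allowed or blocked
# by a store's live robots.txt, using Python's standard library only.
from urllib import robotparser

STORE = "https://mommyandme.online"          # store from this thread; swap in your own
ROBOTS_URL = f"{STORE}/robots.txt"

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# Sample paths (placeholders): Shopify's default rules block cart/checkout-style
# URLs on purpose, while product and collection pages should normally be crawlable.
sample_paths = [
    "/",                       # home page, expected to be allowed
    "/products/example-item",  # hypothetical product URL
    "/cart",                   # blocked by Shopify's default rules
    "/checkout",               # blocked by Shopify's default rules
]

for path in sample_paths:
    url = f"{STORE}{path}"
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':7}  {url}")
```

Running it prints ALLOWED or BLOCKED for each URL, which quickly shows whether the 488k blocked pages are just Shopify's intentional cart/checkout-style rules or pages you actually want indexed.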