My Robots.txt file looks incorrect

Topic summary

A user is concerned that their robots.txt file contains too many “disallow” directives for pages they believe should be indexed by search engines.

Current Status:

  • The robots.txt file appears to follow a default configuration according to one respondent
  • No specific problematic pages have been identified yet

Key Points Raised:

  • Some pages (checkout, cart, admin) are typically and appropriately blocked for security and SEO purposes
  • Important content pages like product or collection pages should generally be indexable
  • The user needs to specify which exact pages are being disallowed to determine if adjustments are necessary

Next Steps:
The discussion remains open, awaiting the original poster to provide specific examples of pages they believe are incorrectly blocked so others can assess whether the robots.txt configuration needs modification.

Summarized with AI on October 30. AI used: claude-sonnet-4-5-20250929.

https://infinitelyfaithful.com/robots.txt

I have a lot of pages that are labeled as “disallow” that I feel should be indexed. Does this look normal? Thank you for any help that you can provide.


Looks default-ish.

Can you tell us which lines you think are wrongfully disallowing pages?


Hello @inffaithful
It depends on which pages are labeled as “disallow.” Some pages, like checkout, cart, and admin, are typically blocked for security and SEO reasons. However, if important pages (such as product or collection pages) are being disallowed, you may need to adjust your robots.txt settings. Could you share specific examples of the pages you believe should be indexed?
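To illustrate the distinction, here is a minimal sketch of what a healthy storefront robots.txt often looks like. The paths below are generic examples of the pattern described above, not lines taken from the actual file at infinitelyfaithful.com:

```
User-agent: *
# Transactional and private areas: blocking these is normal and expected
Disallow: /admin
Disallow: /cart
Disallow: /checkout

# Content pages should NOT appear under Disallow.
# A line like the following would be a problem for SEO:
# Disallow: /products/

Sitemap: https://infinitelyfaithful.com/sitemap.xml
```

If any of the disallowed paths in your file match your product, collection, or blog URLs, those are the lines worth sharing here.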