A user is concerned that their robots.txt file contains too many “disallow” directives for pages they believe should be indexed by search engines.
Current Status:
The discussion remains open; no resolution has been posted.

Key Points Raised:
- The file appears close to a default configuration.
- Pages such as checkout, cart, and admin are typically disallowed for security and SEO reasons.
- If important pages (such as product or collection pages) are being disallowed, the robots.txt settings may need adjustment.

Next Steps:
The original poster has been asked to provide specific examples of pages they believe are incorrectly blocked so others can assess whether the robots.txt configuration needs modification.
https://infinitelyfaithful.com/robots.txt
I have a lot of pages that are labeled as “disallow” that I feel should be indexed. Does this look normal? Thank you for any help that you can provide.
Looks default-ish.
Can you point out which lines you see as wrongly disallowed?
Hello @inffaithful
It depends on which pages are labeled as “disallow.” Some pages, like checkout, cart, and admin, are typically blocked for security and SEO reasons. However, if important pages (such as product or collection pages) are being disallowed, you may need to adjust your robots.txt settings. Could you share specific examples of the pages you believe should be indexed?