robots.txt

Topic summary

The discussion addresses whether to change robots.txt settings from “Disallow” to “Allow” for all pages to improve search engine performance.

Consensus: This is NOT recommended.

Key reasons against allowing all pages:

  • Indexing low-value pages: Cart, checkout, login, and admin pages would get indexed, diluting SEO strength
  • Duplicate content problems: Filter pages, tag pages, and internal search results can create duplicate content issues that harm rankings
  • Crawl budget waste: Search engines have limited crawl budgets; unnecessary pages consume resources that should go to important content
  • Security concerns: Private or sensitive areas (like /admin/ or /cart/) should remain blocked

Recommended approach:
Optimize robots.txt strategically based on SEO best practices rather than making blanket changes. Blocking pages with duplicate or low-value content prevents search engines from wasting time on unnecessary crawling.
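
For illustration, a strategically scoped robots.txt keeps the catalog crawlable and blocks only low-value or sensitive paths. The paths below are generic examples, not a definitive list; check the default file your platform already generates before copying anything:

    User-agent: *
    Disallow: /cart
    Disallow: /checkout
    Disallow: /account
    Disallow: /admin
    Disallow: /search
    Sitemap: https://www.example.com/sitemap.xml

Everything not matched by a Disallow rule is crawlable by default, so no blanket Allow rule is needed.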


Hello,

Do we need to change all pages in robots.txt from Disallow to Allow for the best website performance on search engines?

thanks

Hi @sibelle

Thank you for your question.
No, you shouldn’t switch all pages in the robots.txt file from Disallow to Allow.
Some pages contain duplicate or meaningless content, so blocking them is the right decision to prevent Googlebot from wasting time crawling unnecessary pages.

This is Richard from PageFly - Shopify Page Builder App
No, you should NOT change all Disallow rules to Allow in your robots.txt file just for better search engine performance. Doing so could lead to indexing issues, duplicate content problems, and even security risks. Instead, you should optimize your robots.txt based on best SEO practices.


🚨 Why You Shouldn’t Allow Everything in robots.txt

  1. Unnecessary Pages Get Indexed

    • Allowing all pages can lead to search engines indexing low-value pages (e.g., cart pages, checkout, login, admin pages).
    • This can dilute your website’s SEO strength.
  2. Duplicate Content Issues

    • If search engines crawl filter pages, tag pages, or internal search pages, it can create duplicate content, affecting rankings (a wildcard sketch follows this list).
  3. Wastes Crawl Budget (For Large Sites)

    • Search engines have a limited crawl budget.
    • If bots waste time crawling unnecessary URLs, important pages may not get indexed efficiently.
  4. Security Risks

    • Some private or sensitive areas (like /admin/ or /cart/) should not be crawled.
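
As a sketch of points 2 and 3, wildcard rules can keep crawlers out of filtered, sorted, and internal-search URLs. The parameter names below are only examples; match them to the query strings your store actually generates:

    User-agent: *
    Disallow: /search
    Disallow: /*?q=
    Disallow: /*?sort_by=
    Disallow: /*?filter=

Note that the * wildcard is honored by major crawlers such as Googlebot and Bingbot, but it is not part of the original robots.txt standard, so smaller crawlers may ignore it.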

Hope my solution will help you resolve the issue.
Best regards,
Richard | PageFly