Hello,
Do we need to change all pages in robots.txt from Disallow to Allow for the best website performance on search engines?
thanks
The discussion addresses whether to change robots.txt settings from “Disallow” to “Allow” for all pages to improve search engine performance.
Consensus: This is NOT recommended.
Key reasons against allowing all pages:
- Duplicate content: filtered, sorted, or parameterized pages get crawled and indexed alongside the pages that actually matter.
- Wasted crawl budget, especially on large sites.
- Security risks from exposing admin, cart, and checkout pages to crawlers.
Recommended approach:
Optimize robots.txt strategically based on SEO best practices rather than blanket changes. Blocking pages with duplicate or meaningless content prevents search engines from wasting time on unnecessary crawling.
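As a rough sketch only (the domain and paths below are placeholders, not a drop-in file), a strategically optimized robots.txt keeps real content crawlable, blocks low-value URL variants, and points crawlers at the sitemap:

```
# Sketch with placeholder paths and domain - adapt to the actual site
User-agent: *
Disallow: /search        # internal search results: thin, near-duplicate pages
Disallow: /*?sort=       # sorted/filtered copies of existing pages
Allow: /                 # everything else stays crawlable

Sitemap: https://www.example.com/sitemap.xml
```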
Hi @sibelle
Thank you for your question.
In fact, you shouldn’t switch all pages in the robots.txt file from Disallow to Allow.
Some pages contain duplicate or meaningless content, so blocking them is the right decision to prevent Googlebot from wasting time crawling unnecessary pages.
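For example, a Shopify store's auto-generated robots.txt already blocks this kind of page. The excerpt below only illustrates the sort of rules involved (the exact defaults vary and can change over time), so it is not something you need to copy:

```
# Illustrative excerpt only - not the exact default rules
User-agent: *
Disallow: /cart                    # cart pages have no search value
Disallow: /checkout                # checkout should never be indexed
Disallow: /account                 # customer account pages
Disallow: /collections/*sort_by*   # sorted collection URLs duplicate the main collection page
```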
This is Richard from PageFly - Shopify Page Builder App
No, you should NOT change all Disallow rules to Allow in your robots.txt file just for better search engine performance. Doing so could lead to indexing issues, duplicate content problems, and even security risks. Instead, you should optimize your robots.txt based on best SEO practices.
- Duplicate Content Issues: opening up filtered, sorted, or parameterized URLs lets Google crawl many near-identical versions of the same page.
- Wastes Crawl Budget (For Large Sites): crawlers spend their limited time on low-value pages instead of the pages you actually want ranked.
- Security Risks: pages such as admin, cart, and checkout should stay out of crawling and search results (see the sketch below).
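For illustration, the blanket change you are asking about boils down to a file containing only "User-agent: *" and "Allow: /". Keeping a few targeted Disallow lines instead addresses all three risks above; the paths below are hypothetical, so adapt them to your own store:

```
# Keep targeted rules rather than allowing everything (hypothetical paths)
User-agent: *
Disallow: /admin           # security: back-office URLs stay out of crawls and search results
Disallow: /*?sessionid=    # duplicate content: parameter copies of the same page
Disallow: /filters/        # crawl budget: endless thin filter/listing pages
Allow: /                   # everything else remains crawlable
```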
Hope my solution will help you resolve the issue.
Best regards,
Richard | PageFly