Concern that Shopify’s robots.txt may be blocking Google from crawling top-level pages, potentially limiting product visibility. robots.txt is a file that tells search engine bots which URLs they can or cannot crawl.
One reply says Google can access main pages and products, recommending Google Search Console to check indexing status and identify issues. No specific crawl blocks were confirmed in that response.
Another participant compared robots.txt files across Shopify stores and found some have long disallow lists while others are minimal. Shopify support reportedly said the longer, more restrictive list is their recommended default, though the user remains unconvinced and seeks further advice.
A final comment claims such blocking will hurt rankings, but provides no evidence or examples. No technical diagnostics (e.g., Search Console reports) were shared in the thread.
Outcome: No resolution. Key next step suggested is verifying crawl/indexing in Google Search Console and reviewing the store’s robots.txt rules. The question of whether Shopify’s defaults are harmful remains open.
I have been told that robots.txt is not allowing Google to access my top-level pages, which means my products are not seen by potential buyers who are searching for them because the bot crawler is DISALLOWED. Is this normal?
Please check your robots.txt file; it seems Google can find your main pages, including products. You can use Google Search Console to check your site for any indexing issues.
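If you'd rather verify this outside of Search Console, here is a minimal sketch using Python's standard urllib.robotparser that asks your live robots.txt whether Googlebot may crawl your main pages. The store domain and paths below are placeholders; swap in your own storefront URLs.

```python
# Minimal sketch: ask the live robots.txt whether Googlebot may crawl key pages.
# The domain and paths below are placeholders for illustration.
from urllib import robotparser

STORE = "https://example-store.myshopify.com"
PATHS = ["/", "/collections/all", "/products/example-product"]

rp = robotparser.RobotFileParser()
rp.set_url(STORE + "/robots.txt")
rp.read()  # fetch and parse the store's robots.txt

for path in PATHS:
    allowed = rp.can_fetch("Googlebot", STORE + path)
    print(f"{path}: {'allowed' if allowed else 'DISALLOWED'} for Googlebot")
```

If the homepage, collection, and product paths all come back allowed, the Disallow lines you're seeing most likely apply to other URLs rather than your sellable pages.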
I am also receiving repeated messages about this issue. I followed the advice mentioned above and viewed the robots.txt file on my store and on competitor stores, and found that my store and some others on Shopify have a long list of items that are blocked, whereas some other stores have very few lines in their robots.txt.
I raised this with the Shopify support team, and they say that the long list is their recommended default. I'm still not 100% convinced, as the CSR didn't seem to have any knowledge of this issue.
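For anyone who wants to repeat that comparison without reading the files by hand, here is a rough sketch that downloads each store's robots.txt and summarizes its Disallow rules. The store URLs are placeholders for whichever shops you want to compare.

```python
# Rough sketch: download each store's robots.txt and summarize its Disallow rules.
# The store URLs are placeholders; substitute the shops you want to compare.
import urllib.request

STORES = [
    "https://example-store-one.myshopify.com",
    "https://example-store-two.myshopify.com",
]

for store in STORES:
    with urllib.request.urlopen(store + "/robots.txt") as resp:
        lines = resp.read().decode("utf-8").splitlines()
    disallows = [l for l in lines if l.strip().lower().startswith("disallow:")]
    print(f"{store}: {len(lines)} total lines, {len(disallows)} Disallow rules")
    for rule in disallows:
        print("  " + rule.strip())
```

A longer file isn't automatically worse; what matters is which paths are disallowed, so compare the actual rules rather than the line count.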