robots.txt.liquid not applying

Topic summary

A store owner is attempting to allow legal pages (e.g., /policies/terms) to be indexed by customizing robots.txt.liquid with Allow: /policies/. However, the live robots.txt still shows Disallow: /policies/ and Disallow: /*/policies/, preventing Google from indexing these pages.

Key points:

  • Theme is published and file is correctly placed in templates/ folder
  • Shopify Support confirmed the file looks correct but couldn’t escalate as it involves code
  • One respondent suggests Shopify intentionally overrides custom rules for /policies/ and /checkout/ paths to prevent indexing of sensitive content and avoid duplicate content issues
  • Another user reports their robots.txt appears correct and suggests it may simply take time for search engines to re-index, recommending Google Search Console to speed up the process

Status: The discussion remains open with conflicting explanations—unclear whether this is a Shopify limitation or a timing/indexing delay issue.

Summarized with AI on October 28. AI used: claude-sonnet-4-5-20250929.

Hello Shopify Community,

I am trying to allow my legal pages (such as /policies/terms) to be indexed by customizing the robots.txt.liquid file in my published Shopify theme.

Here is the simplified version of my current file:

User-agent: *
Allow: /policies/
Allow: /checkouts/internal/preloads.js
Disallow: /checkouts/
Disallow: /checkout

Despite these settings, the live robots.txt at https://lalaloom.no/robots.txt still shows:
Disallow: /policies/
Disallow: /*/policies/

So Google continues to block those pages.
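As a sanity check, the blocking behavior of the live rules quoted above can be reproduced locally with Python's standard-library robots.txt parser (only the relevant lines are included here):

```python
from urllib import robotparser

# The relevant lines from the live robots.txt quoted above.
live_rules = """\
User-agent: *
Disallow: /policies/
Disallow: /*/policies/
"""

rp = robotparser.RobotFileParser()
rp.parse(live_rules.splitlines())

# /policies/terms is blocked by the "Disallow: /policies/" prefix rule.
print(rp.can_fetch("*", "https://lalaloom.no/policies/terms"))  # → False
# Pages outside the disallowed prefixes remain crawlable.
print(rp.can_fetch("*", "https://lalaloom.no/products/example"))  # → True
```

This only confirms what a standards-following crawler would do with the served file; it says nothing about why Shopify serves those rules.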
I have verified:

  • The theme is published
  • robots.txt.liquid is in the correct folder (templates/)
  • The file saves correctly

I also spoke with Shopify Support who confirmed the file looks correct, but mentioned they could not escalate the issue as it involves code.

Can anyone from the community or staff clarify whether Shopify storefronts respect robots.txt.liquid reliably? Or is this a known limitation?

Thank you!
Ali
https://lalaloom.no


Hi,

Even though you’ve customized the robots.txt.liquid file in your theme, Shopify overrides or appends its default disallow rules, especially for /policies/ and other sensitive paths such as /checkout/.

This is intentional by Shopify to prevent indexing of potentially sensitive or dynamically generated content that may cause duplicate content issues or legal problems.

It seems like your robots.txt file itself is correct.

I have used a robots.txt customization before to hide my product detail pages. It can take time for search engines like Google to re-crawl your site, so if you want to speed things up, you can request re-indexing through Google Search Console.