My templates/robots.txt.liquid was updated to remove the Disallow: /de/, Disallow: /fr/, and Disallow: /nl/ rules. The live file at https://henrypawhaven.com/robots.txt confirms the correct version is saved: no locale disallow rules are present.
However, Google Search Console Live Test (tested multiple times on April 6, 2026) continues to show the old blocked version being served to Googlebot, with “Page cannot be crawled: Blocked by robots.txt” for all locale URLs.
What I have verified:
robots.txt.liquid is clean — no locale disallow rules
No robots directives in theme.liquid, meta-tags.liquid, or any other snippet
Only one robots.txt.liquid file exists in the theme
No app remnants found in any theme file
Live browser fetch of henrypawhaven.com/robots.txt shows the correct clean file
The discrepancy: the file Shopify serves to browsers is correct, while the file Shopify serves to Googlebot is the old blocked version. This points to a CDN-level cache that is not clearing even though the template has been updated and saved.
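For anyone who wants to reproduce the "blocked vs. clean" check locally, here is a small offline sketch using Python's standard urllib.robotparser to test whether a given robots.txt body blocks Googlebot from the locale URLs. The domain and paths are the ones from this thread; the two sample robots.txt bodies are illustrative, not the exact files Shopify served:

```python
from urllib.robotparser import RobotFileParser

# Locale URLs from this thread that GSC reported as blocked.
LOCALE_URLS = [
    "https://henrypawhaven.com/de/",
    "https://henrypawhaven.com/fr/",
    "https://henrypawhaven.com/nl/",
]

def blocked_locales(robots_body: str, agent: str = "Googlebot") -> list:
    """Return the locale URLs that this robots.txt body blocks for `agent`."""
    parser = RobotFileParser()
    parser.parse(robots_body.splitlines())
    return [url for url in LOCALE_URLS if not parser.can_fetch(agent, url)]

# Illustrative stale version, with the locale disallow rules the SEO app added:
stale = """User-agent: *
Disallow: /de/
Disallow: /fr/
Disallow: /nl/
"""

# Illustrative clean version after the template update:
clean = """User-agent: *
Disallow:
"""

print(blocked_locales(stale))  # all three locale URLs
print(blocked_locales(clean))  # []
```

Pasting the body that GSC's Live Test shows into `blocked_locales` is a quick way to confirm which version of the file Googlebot is actually seeing.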
Timeline:
March 21, 2026: Locale disallow rules added via a now-uninstalled SEO app
April 6, 2026 morning: Rules removed from template
April 6, 2026, 11:46 AM CEST: GSC Live Test still reports the blocked version
Question: Has anyone experienced Shopify’s CDN serving a stale robots.txt to Googlebot after updating robots.txt.liquid? Was there a workaround or did it require Shopify infrastructure intervention?
Support ticket open and escalated. Posting here for community visibility and any faster workarounds.
For starters, a live test does not crawl or index anything; it only checks the current status. You have to request re-indexing, then wait for the page to be crawled. Once it's crawled again, it will be indexed again. Easy. It's not complicated. It's not hard. There is no need to waste people's time with erroneous support tickets.
Ah, there it is again: a very generic response. Thank you, Maximus3, for making the usual assumptions. What do you think was the very first thing that was done?
Advice from my side: read the actual post before dropping a comment that is not particularly useful. It is also not clear what you are trying to achieve.
Dismissing a post like this might reflect back on you; think about that.
In addition, the ticket was acknowledged and escalated by support. To those who left dismissive comments: nice job trying to discourage people from asking questions or posting at all. You make the community less valuable with every dumb reply.
March 21, 2026: Issue begins. Ticket opened. Support confirmed forwarding to Dev team.
April 6, 2026: Re-inquiry after no resolution. GSC Live Test still serving the blocked version to Googlebot.
April 10, 2026: Support reply reveals the ticket was never forwarded to Dev, despite the March 21 confirmation. Support escalates immediately, again.
April 14, 2026: Specialist Support acknowledges CDN-level cache issue, orders force-invalidation, requests 5 business days for Infrastructure & Engineering to resolve.
April 18, 2026: SLA window passed. No resolution, no confirmation, and further SEO damage accumulating: 28 days and counting.
For anyone in a similar situation: updating robots.txt.liquid is not enough if the CDN has cached a stale version at the infrastructure level. There is no merchant-accessible workaround; this requires Shopify-side intervention.
Specific question for the community:
Has anyone successfully triggered a CDN cache purge for robots.txt on Shopify's infrastructure, either through a support escalation path that actually worked or through any other mechanism? Specifically: is there anything on the merchant side (theme republish, domain disconnect/reconnect, specific support escalation wording) that forced a CDN invalidation in your case?
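While waiting on escalation, one way to document the discrepancy for support is to fetch robots.txt with different User-Agent headers and compare the response bodies. A caveat: a spoofed User-Agent from your own machine may still hit a different CDN edge than Googlebot's real crawl IPs, so identical responses here do not rule out a stale edge. A minimal stdlib sketch (the URL is the store from this thread; the comparison helpers are offline, and the network fetch only runs when the script is executed directly):

```python
import hashlib
import urllib.request

URL = "https://henrypawhaven.com/robots.txt"  # store from this thread

def body_digest(body: bytes) -> str:
    """Short fingerprint so two responses can be compared at a glance."""
    return hashlib.sha256(body).hexdigest()[:12]

def bodies_identical(a: bytes, b: bytes) -> bool:
    """True if the two robots.txt bodies are byte-identical."""
    return a == b

def fetch(url: str, user_agent: str) -> bytes:
    """Fetch `url`, presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

if __name__ == "__main__":
    browser = fetch(URL, "Mozilla/5.0 (X11; Linux x86_64)")
    googlebot = fetch(
        URL,
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    )
    print("browser  :", body_digest(browser))
    print("googlebot:", body_digest(googlebot))
    print("identical:", bodies_identical(browser, googlebot))
```

Logging these fingerprints with timestamps over a few days gives support a concrete record of what was served and when, which may help an escalation land with the right team.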