hi
is it correct that the below url is blocked - https://infraredheatingsupplies.com/checkouts/internal/preloads.js?locale=en-GB
Hi,
Yes, this is correct and expected. I ran into a similar question recently: it just means that search engine bots are blocked from accessing that file, which is fine, because you generally only want bots to crawl pages with content, such as blogs, products, etc.
Hope that makes sense?
Regards
peppergray.shop
Hi,
Thank you for your reply. So to confirm: this page shouldn't be crawled and doesn't have any content; it's just a backend file for the checkout (or linked to the checkout), so it doesn't need to be indexed?
That’s right. No need for it to be crawled.
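If you want to verify this yourself, Python's standard-library `urllib.robotparser` can evaluate a URL against robots.txt rules. This is just a sketch: the Disallow lines below are assumed for illustration, so compare them against your store's actual /robots.txt before drawing conclusions.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules resembling a storefront's defaults
# (assumed content; fetch your own /robots.txt to see the real rules).
rules = """
User-agent: *
Disallow: /checkouts/
Disallow: /checkout
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

url = "https://infraredheatingsupplies.com/checkouts/internal/preloads.js?locale=en-GB"

# can_fetch() returns False when the URL path matches a Disallow rule,
# i.e. crawlers honouring robots.txt will skip it.
print(parser.can_fetch("Googlebot", url))  # False: blocked by Disallow: /checkouts/
```

Running the same check against an ordinary product or blog URL returns True, which matches the point above: only the checkout paths are kept out of the crawl.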
Hi,
Thank you for confirming. I assume it's the same with /external/?
Yes, it's the same for /external/.
If your issue is resolved, could you mark this thread as answered? Thanks, and good luck with your store.
I found more insight on this question here:
https://developers.google.com/search/docs/crawling-indexing/robots/intro