Search engines such as Google constantly crawl the internet for new data to use in their search results. The robots.txt file tells search engine bots, known as crawlers, which pages they can request from your online store.
All Shopify stores have a default robots.txt file that's optimized for Search Engine Optimization (SEO). Previously, editing this file wasn't possible, but that has now changed!
With a recent update, you can now customize your robots.txt file beyond the optimized default currently used by your shop. A few of the edits you might consider include:
Allowing or disallowing certain URLs from being crawled
Adding crawl-delay rules
Adding extra sitemap URLs
Blocking certain crawlers
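As a rough sketch of what such an edit can look like, the template below renders the default rules and appends one extra Disallow rule to the catch-all group. It assumes the robots, group, and rule Liquid objects described in Shopify's developer documentation, and the /example-page path is purely a placeholder:

```liquid
{%- comment -%}
  Sketch only: output every default group as-is, then add a custom
  Disallow rule to the group that applies to all user agents (*).
  The path "/example-page" is a hypothetical example.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /example-page' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the default groups are still looped over in full, the optimized rules Shopify ships with remain intact; only the one extra line is added. Verify the exact object names against the developer documentation linked below before relying on this pattern.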
Please be mindful that this is an unsupported customization, and incorrectly applied edits can have a significant impact on your traffic. If you encounter issues after modifying robots.txt, you can restore the default robots.txt configuration. For guidance on how to do this, please review: Delete robots.txt.liquid customizations.
If you would like assistance customizing your robots.txt file, you will need to hire a Shopify Expert, as Shopify Support cannot help with the edits you would like to make or provide feedback before you implement changes.
Check out the links below to learn more about this new feature, and review some examples in our developer documentation. If you have any questions about this feature, leave a reply below!
If you would like community feedback on edits you are considering, post a detailed message on our Technical Q&A discussion board.
Trevor | Community Moderator @ Shopify - Was my reply helpful? Click Like to let me know! - Was your question answered? Mark it as an Accepted Solution - To learn more visit the Shopify Help Center or the Shopify Blog