Hi there,
robots.txt has blocked thousands of pages.
Is it possible to help delete some rules so the blog and collection sections are allowed to be crawled?
Hi @mythgreece
To allow your blog and collection sections to be crawled by search engines, you'll need to edit your robots.txt file. Here's how you can do it:
Suggested robots.txt rule changes
To enable crawling for the blog and collections, modify the existing robots.txt file by removing the specific Disallow rules that block those sections. Here's a simplified example of what the file could look like afterwards:
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /orders/
Disallow: /checkout/
Disallow: */collections/*sort_by*
Allow: /blogs/ # Allow crawling for all blog sections
Allow: /collections/ # Allow crawling for all collections
Sitemap: https://www.example.com/sitemap.xml # Update to your sitemap URL
A couple of notes:
Each Disallow rule should be carefully reviewed before you remove it.
Changes to robots.txt may take time to be reflected; be patient while search engines re-crawl your site.
If you have additional questions about specific rules or further steps, feel free to ask me!
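If you'd like a quick local sanity check before waiting on search engines, here is a minimal sketch using Python's standard urllib.robotparser. The rules and URL paths below are placeholders, so substitute the contents of your actual robots.txt file:

```python
from urllib import robotparser

# Sample rules mirroring the suggested robots.txt (placeholder paths)
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /blogs/
Allow: /collections/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Blog and collection pages should now be crawlable...
print(parser.can_fetch("*", "/blogs/news"))          # True
print(parser.can_fetch("*", "/collections/summer"))  # True
# ...while admin and cart pages stay blocked
print(parser.can_fetch("*", "/admin/settings"))      # False
```

This only checks the rules as written; Google Search Console remains the authoritative test for how Googlebot actually interprets your live file.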
Hi @mythgreece
You can follow this guide:
You'll want to add Allow rules for the blog and collections to your robots.txt.liquid template. Here's a simple code snippet you can use:
# we use Shopify as our ecommerce platform
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' %}
# Allow crawling for blog and collections
Allow: /blogs/
Allow: /collections/
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
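For reference, once that template renders, the live robots.txt should look roughly like this. This is illustrative only; your store's default rules and sitemap URL will differ:

```text
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Allow: /blogs/
Allow: /collections/
Sitemap: https://your-store.myshopify.com/sitemap.xml
```

You can view the rendered result at any time by visiting /robots.txt on your store's domain.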
Save Your Changes: After adding that code, make sure to save the modifications.
Then Test It: Once you’ve saved the changes, go to Google Search Console and use the URL Inspection Tool to check if your blog and collection pages are now allowed to be crawled.
Please make sure the new Allow rules don't conflict with any existing Disallow rules in the file. If you have any questions about this process or need further help, just let me know!
Hi @mythgreece
Shopify recommends avoiding direct edits to the robots.txt file.
If you’re uncertain, you can use an SEO app for safer optimization. Alternatively, if you prefer not to use an app, you can follow the official guide provided by Shopify here:
Shopify: Editing the robots.txt file.
Best,
Daisy - Avada Support Team.