Robots.txt edit

Hi there,

My robots.txt has blocked thousands of pages.

Is it possible to help me delete some rules so that the blog and collection sections are allowed to be crawled?

Hi @mythgreece

To allow your blog and collection sections to be crawled by search engines, you’ll need to edit your robots.txt file. Here’s how you can do it:

Current robots.txt Rules

  • Disallowed directories: many specific rules are set to disallow crawling of various blog and collection pages.

Suggested Changes

To enable crawling of the blog and collections, you can modify the existing robots.txt file by removing or adjusting the specific Disallow rules. Here’s a simplified example with the blanket blog and collection blocks removed:

User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /orders/
Disallow: /checkout/
Disallow: */collections/*sort_by*
Disallow: /blogs/*?*
Allow: /blogs/ # Allow crawling for all blog sections
Allow: /collections/ # Allow crawling for all collections
Sitemap: https://www.example.com/sitemap.xml # Update to your sitemap URL
You can follow these steps to update it:

  1. Access Your robots.txt File:
  • Via your website’s root directory (for Shopify, you edit the robots.txt.liquid template through the admin panel).
  2. Edit the File:
  • Use the above template as a guide and specify which sections you want search engines to crawl.
  3. Test Changes:
  • Use the robots.txt report or the URL Inspection Tool in Google Search Console to verify that the changes have been applied correctly. For a quick local check, see the sketch after this list.
  4. Monitor Changes:
  • After making changes, monitor your indexing status in Google Search Console to ensure that the pages are being indexed.
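If you’d like to verify outside Search Console as well, here’s a minimal sketch using Python’s standard urllib.robotparser module. The domain and paths are placeholders for illustration, and robotparser doesn’t understand Google-style * wildcards, so treat it as a rough sanity check and confirm the final result in Search Console:

from urllib.robotparser import RobotFileParser

# Placeholder domain - replace with your own store
SITE = "https://www.example.com"

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

# Placeholder paths - replace with real blog and collection URLs
for path in ("/blogs/news/my-first-post", "/collections/summer-sale"):
    allowed = parser.can_fetch("*", SITE + path)  # "*" = any crawler
    print(path, "-> crawlable" if allowed else "-> blocked")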

Please note that:

  • Avoid Over-disallowing: Ensure that important sections aren’t inadvertently blocked. Robots.txt rules match by path prefix, so each Disallow rule should be carefully reviewed (see the sketch after this list).
  • Caching: Search engines cache robots.txt, so changes may take a while to take effect. Be patient as search engines re-crawl your site.
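To illustrate the over-disallowing point with a hypothetical sketch (again using Python’s urllib.robotparser): a rule like Disallow: /blog also blocks pages outside the blog, because matching is by prefix:

from urllib.robotparser import RobotFileParser

# A deliberately over-broad rule - hypothetical example
rules = """\
User-agent: *
Disallow: /blog
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse in memory, no network needed

# Prefix matching blocks more than just /blog/ pages
for path in ("/blog/post-1", "/blog-news/post-2", "/blogger-interviews/"):
    allowed = parser.can_fetch("*", path)
    print(path, "-> crawlable" if allowed else "-> blocked")

All three paths come back blocked, which is why each Disallow rule deserves a careful look.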

If you have additional questions about specific rules or further steps, feel free to ask me!

Appreciate your answer.

What if my robots.txt looks like this? What specific code should I add?

Hi @mythgreece

You can follow this guide: you’ll want to add Allow rules for the blog and collections inside the group for all user agents. Here’s a simple code snippet you can use in your robots.txt.liquid template:

# we use Shopify as our ecommerce platform
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Allow crawling for blog and collections {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Allow: /blogs/' }}
    {{ 'Allow: /collections/' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
  • Save Your Changes: After adding that code, make sure to save the modifications.

  • Then Test It: Once you’ve saved the changes, go to Google Search Console and use the URL Inspection Tool to check if your blog and collection pages are now allowed to be crawled.

Please note that:

  • Check for Conflicts: Before you save, make sure these new Allow rules don’t conflict with any existing Disallow rules in the file. A quick way to sanity-check a draft locally is shown in the sketch after this list.
  • Monitor Your Pages: Keep an eye on indexing status in Google Search Console after making these changes.
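One way to do that conflict check before saving anything in Shopify is to parse a draft of the rules in memory, again with Python’s urllib.robotparser. The rules and paths below are placeholders, and since robotparser doesn’t support Google-style * wildcards, keep the draft to plain path prefixes:

from urllib.robotparser import RobotFileParser

# Draft of the rules you plan to publish - placeholders only
draft = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /blogs/
Allow: /collections/
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())  # no network, just the draft text

# Confirm the Allow rules win for the pages you care about
for path in ("/blogs/news/my-first-post", "/collections/sale", "/cart/"):
    allowed = parser.can_fetch("*", path)
    print(path, "-> crawlable" if allowed else "-> blocked")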

If you have any questions about this process or need further help, just let me know!

Hi @mythgreece

Shopify recommends avoiding direct edits to the robots.txt file.

If you’re uncertain, you can use an SEO app for safer optimization. Alternatively, if you prefer not to use an app, you can follow the official guide provided by Shopify here:
Shopify: Editing the robots.txt file.

Best,
Daisy - Avada Support Team.