I've noticed that the number of pages reported as "Blocked by robots.txt" in Google Search Console has dropped to zero. I have no idea why; I've attached a screenshot and the contents of my robots.txt file below.
Any help would be appreciated.
We use Shopify as our e-commerce platform.
{%- comment -%}
Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{%- endcomment -%}
{% for group in robots.default_groups %}
{{- group.user_agent }}
{% for rule in group.rules %}
{{- rule }}
{% endfor %}
{%- if group.sitemap != blank %}
{{ group.sitemap }}
{% endif %}
{% endfor %}
Sitemap: https://www.stargazer-products.com/sitemap.xml
User-agent: Googlebot
Disallow:
User-agent: Googlebot-Image
Disallow:
User-agent: *
Disallow: /?variant=
Disallow: /?pr_prod
Disallow: /?cat=
Disallow: /?pf
Disallow: /?SID=
Disallow: /?p=
Disallow: /?___store
Disallow: /.atom
Disallow: /.oembed
Disallow: //catalogsearch
Disallow: /?ref=*
Disallow: //product_compare/
Disallow: //_ignore_category
Disallow: /view/id
Disallow: /blogs//tagged/
Disallow: /=uniform
Disallow: /?filter
Disallow: /collections/all*
Disallow: /?q=
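For what it's worth, one way to sanity-check how these groups interact is Python's standard `urllib.robotparser`. This is only a rough local check, assuming the live file renders the same groups as above: `robotparser` matches literal prefixes and does not understand Googlebot-style `*` wildcards, so only wildcard-free rules are copied below, and `example.com` stands in for the real domain.

```python
from urllib import robotparser

# Trimmed-down copy of the rules above (wildcard rules like
# "Disallow: /collections/all*" omitted, since robotparser
# treats "*" in a path literally rather than as a wildcard).
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /?variant=
Disallow: /?q=
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches its own group, whose empty Disallow allows everything.
print(rp.can_fetch("Googlebot", "https://example.com/?variant=123"))  # True

# Crawlers without a group of their own fall back to the * rules.
print(rp.can_fetch("SomeOtherBot", "https://example.com/?variant=123"))  # False
```

Because the `User-agent: Googlebot` group exists and carries an empty `Disallow:`, Googlebot ignores the `User-agent: *` rules entirely; that is one plausible reason Search Console would report zero pages blocked for Google.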
