Same here.
The second line of the .atom page is something like this:
<feed xml:lang="en" xmlns="http://www.w3.org/2005/Atom" xmlns:opensearch="http://a9.com/-/spec/opensearch/1.1/" xmlns:s="http://jadedpixel.com/-/spec/shopify">
and should include a noindex hint, for example via the Atom feed indexing extension (indexing:index="no", with its namespace declared), like this:
<feed xml:lang="en" xmlns="http://www.w3.org/2005/Atom" xmlns:indexing="urn:atom-extension:indexing" indexing:index="no" xmlns:opensearch="http://a9.com/-/spec/opensearch/1.1/" xmlns:s="http://jadedpixel.com/-/spec/shopify">
or something equivalent.
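Another way to send the same signal, without touching the feed markup at all, would be an X-Robots-Tag HTTP header on the .atom responses; Google documents that header as the way to noindex non-HTML resources. A rough sketch of what the response would need to look like (the exact Content-Type Shopify serves may differ):
HTTP/1.1 200 OK
Content-Type: application/atom+xml; charset=utf-8
X-Robots-Tag: noindex
As far as I know merchants can't set response headers on Shopify, so that option would also have to come from Shopify's side.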
Anyone from the Shopify team here?
These .atom pages are used as feeds and are supposed to help search engines pick up fresh content quickly. That doesn't seem to be much of a benefit anymore, and Google has decided to index them…
Quick fix: since we use Shopify as our ecommerce platform, update the robots.txt.liquid template (templates/robots.txt.liquid in the theme code) like this:
{% for group in robots.default_groups %}
  {{- group.user_agent -}}
  {% for rule in group.rules %}
    {{ rule }}
  {% endfor %}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /?q=' }}
    {{ 'Disallow: /?filter*' }}
    {{ 'Disallow: //sandbox/' }}
    {{ 'Disallow: /*.atom' }}
    {{ 'Disallow: /*.oembed' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
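For reference, with those additions the wildcard group of the rendered robots.txt should end up looking roughly like this (Shopify's own default rules come first; example.com stands in for the store domain):
User-agent: *
Disallow: /admin
... (the rest of Shopify's default Disallow rules)
Disallow: /?q=
Disallow: /?filter*
Disallow: //sandbox/
Disallow: /*.atom
Disallow: /*.oembed
Sitemap: https://example.com/sitemap.xml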
This should help, but the problem remains for the .atom pages that are already indexed. One option is to wait for Google to deindex the now robots-blocked pages on its own, but it's not clear how fast or reliable that is. We can also request removal manually in Google Search Console, but that's a pain when a lot of pages are affected.
The best option would be to add noindex to these .atom pages as shown at the beginning, if Shopify is kind enough to do it, or if anybody can help inject the noindex on those lines. Anyone, please?