Robots.txt is blocking all blog posts from being crawled

ScreenBlues
New Member

The Shopify robots.txt file (which store owners can't edit) is disallowing all blogs from being crawled by Google. This makes content marketing impossible, since no blog post will ever be indexed or rank. Looking for a solution.

From my robots.txt:

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /26379681876/checkouts
Disallow: /26379681876/orders
Disallow: /carts
Disallow: /account
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /*/collections/*+*
Disallow: /*/collections/*%2B*
Disallow: /*/collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/blogs/*+*
Disallow: /*/blogs/*%2B*
Disallow: /*/blogs/*%2b*
Disallow: /*design_theme_id*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*
Disallow: /policies/
Disallow: /search
Disallow: /apple-app-site-association
Sitemap: https://screenblues.com/sitemap.xml
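
For anyone who wants to test which URLs those Disallow patterns actually match, here is a rough Python sketch of Google-style wildcard matching (* matches any run of characters, $ anchors the end, everything else is literal). This is my own sketch, not Google's or Shopify's actual matcher, and the test URLs are made up:

import re

def rule_matches(rule, path):
    # Translate a robots.txt path rule with Google-style wildcards into
    # a regex: '*' becomes '.*', '$' anchors the end, the rest is literal.
    pattern = re.escape(rule).replace(r'\*', '.*').replace(r'\$', '$')
    # robots.txt rules match from the start of the path, like re.match.
    return re.match(pattern, path) is not None

# Hypothetical URLs: a plain blog post vs. a tag-filtered blog page.
# The blog rules above cover '+' and its URL-encoded forms (%2B).
print(rule_matches('/blogs/*+*', '/blogs/news/my-first-post'))   # False
print(rule_matches('/blogs/*+*', '/blogs/news/tagged/sale+new')) # True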

snde
New Member

Same issue here.

I read on Google's forum that a more stringent setting such as "noindex, nofollow" will override a less stringent setting such as "index, follow". If that's the case, and Shopify's default setting for sections (such as blogs) is "noindex, nofollow", then any code inserted in theme.liquid (such as the snippet below) will be ignored. Is that right?

{% if template contains 'search' %}
<meta name="robots" content="noindex">
{% endif %}
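
To make sure I'm reading that rule correctly, here is a toy Python sketch of the "most restrictive wins" behaviour I described above (my own illustration, not Google's actual logic):

def resolve(directives):
    # Collapse several meta-robots values into the effective result:
    # any 'noindex' beats 'index', any 'nofollow' beats 'follow'.
    tokens = {t.strip() for d in directives for t in d.split(',')}
    index = 'noindex' if 'noindex' in tokens else 'index'
    follow = 'nofollow' if 'nofollow' in tokens else 'follow'
    return f'{index}, {follow}'

# Hypothetical conflict: the theme emits one tag, the platform another.
print(resolve(['index, follow', 'noindex']))  # 'noindex, follow'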