Traffic Drop due to robots.txt disavow?

Our web traffic has dropped dramatically since March. SEMRush flagged over 60 new issues on March 5th, reporting internal resources blocked by robots.txt. Below is a small section of our current robots.txt file. Is this normal, and could it be causing the drop in traffic?

User-agent: *
Disallow: /a/downloads/-/*
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /41189769374/checkouts
Disallow: /41189769374/orders
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /*/collections/*sort_by*
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /*/collections/*+*
Disallow: /*/collections/*%2B*
Disallow: /*/collections/*%2b*
Disallow: */collections/*filter*&*filter*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Disallow: /*/blogs/*+*
Disallow: /*/blogs/*%2B*
Disallow: /*/blogs/*%2b*
Disallow: /*?*oseid=*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*
Disallow: /policies/
Disallow: /*/policies/
Disallow: /*/*?*ls=*&ls=*
Disallow: /*/*?*ls%3D*%3Fls%3D*
Disallow: /*/*?*ls%3d*%3fls%3d*
Disallow: /search
Disallow: /apple-app-site-association
Disallow: /.well-known/shopify/monorail
Disallow: /cdn/wpm/*.js
Disallow: /services/login_with_shop
Disallow: /recommendations/products
Disallow: /*/recommendations/products
Sitemap: https://themillworkoutlet.com/sitemap.xml

Yes, the robots.txt file could be contributing to the drop if it's blocking resources that search engines need to render or index your pages. That said, most of the rules shown (cart, checkout, account, sorted and filtered collection views) appear to match Shopify's default robots.txt, and blocking those paths is normal and expected. The entries worth a closer look are resource paths such as /cdn/wpm/*.js: if Googlebot cannot fetch scripts required to render a page, that can affect how the page is indexed. Review the file, compare the blocked URLs SEMRush reports against the pages that actually lost traffic in Google Search Console, and only unblock paths that indexed pages genuinely depend on.
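To check which of your URLs these rules actually match, you can translate the Disallow patterns into regexes. This is a minimal sketch of Google-style pattern matching (`*` matches any run of characters, a trailing `$` anchors the end of the URL); it only evaluates Disallow rules and ignores Allow/longest-match precedence, and the sample paths are hypothetical. Note that Python's built-in `urllib.robotparser` does not support these wildcards, which is why a regex translation is used here.

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt Disallow pattern into a compiled regex."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    # Escape regex metacharacters, then restore '*' as "match anything".
    regex = re.escape(body).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(path: str, disallow_rules: list[str]) -> bool:
    """True if any Disallow rule matches the URL path (simplified: no Allow rules)."""
    return any(pattern_to_regex(rule).match(path) for rule in disallow_rules)

# A few rules from the file above, tested against hypothetical URLs.
rules = ["/checkout", "/collections/*sort_by*", "/cdn/wpm/*.js"]

print(is_blocked("/collections/shoes", rules))                # False: plain collection page
print(is_blocked("/collections/shoes?sort_by=price", rules))  # True: sorted view
print(is_blocked("/cdn/wpm/abc123.js", rules))                # True: script resource
```

Running your top landing pages (from Search Console) through a check like this tells you quickly whether the rules SEMRush flagged touch pages that matter, or only the cart/checkout/parameter URLs that should be blocked anyway.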