Google Search unable to crawl sitemaps, blocked by robots.txt

fad_su
Excursionist
14 0 3

I recently added two markets to my store and also added subfolders to the domain, and since then Google Search has been unable to crawl my sitemaps. It can read the main sitemap but fails to crawl the sitemaps indexed under it. I am using the default robots.txt template with no changes made to it. Below is the error from URL Inspection: it says "Blocked by robots.txt", but I have checked several times and nothing seems to be blocking it.

 

[Screenshot: URL Inspection result showing "Blocked by robots.txt"]

Any suggestions what the issue might be?

Replies 3 (3)

louis_digi
Visitor
2 0 0

Sounds like your robots.txt might be unintentionally blocking the new subfolders. Even default templates can cause issues when new directories like /en/ or /de/ are added. Here's what to check:

  1. Review robots.txt – Make sure it doesn’t include Disallow: /en/ or similar lines that block subfolders.

  2. List all sitemaps – Ensure your robots.txt includes full sitemap URLs like:

    Sitemap: https://yourdomain.com/en/sitemap.xml
  3. Use GSC's Robots.txt Tester – It’ll show if a specific URL is being blocked and by which rule.

  4. Test sitemap URLs – Open them directly to make sure they return a 200 status and are publicly accessible (a small script that checks both this and the robots.txt rules is sketched after this list).

  5. Manually resubmit subfolder sitemaps – Go to Search Console > Sitemaps and submit the new ones.
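
If you want to check points 3 and 4 outside of GSC, here is a quick Python sketch (standard library only; the domain and sitemap URLs below are placeholders, and Python's robots.txt parser is not guaranteed to match Google's behaviour exactly):

    import urllib.error
    import urllib.request
    import urllib.robotparser

    DOMAIN = "https://yourdomain.com"  # placeholder: your store's primary domain
    SITEMAPS = [
        f"{DOMAIN}/sitemap.xml",
        f"{DOMAIN}/en/sitemap.xml",  # placeholder market subfolder
        f"{DOMAIN}/de/sitemap.xml",  # placeholder market subfolder
    ]

    # Parse the live robots.txt roughly the way a crawler would
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{DOMAIN}/robots.txt")
    rp.read()

    for url in SITEMAPS:
        allowed = rp.can_fetch("Googlebot", url)  # would these rules block Googlebot?
        try:
            with urllib.request.urlopen(url) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code
        print(f"{url} -> allowed for Googlebot: {allowed}, HTTP status: {status}")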

If all looks good and it still says "Blocked by robots.txt," feel free to share a sample URL and your robots.txt so the community can help debug further.
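
For comparison, a robots.txt that lists the subfolder sitemaps and doesn't block them would look roughly like this (purely illustrative, not the actual Shopify default template; the domain, locales and Disallow rules are placeholders):

    User-agent: *
    Disallow: /cart
    Disallow: /checkout

    Sitemap: https://yourdomain.com/sitemap.xml
    Sitemap: https://yourdomain.com/en/sitemap.xml
    Sitemap: https://yourdomain.com/de/sitemap.xml

The key points are that no Disallow line matches the market subfolders and that every child sitemap you want crawled is listed.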

fad_su
Excursionist
14 0 3

Everything looks fine to me. The default sitemap is an index sitemap that references the other sitemaps; the screenshot is below.

[Screenshot: sitemap index listing the child sitemaps]

When I try to fetch the individual sitemaps listed in the main sitemap via GSC's URL Inspection, I get the "Blocked by robots.txt" error. I can open all of these sitemaps in the browser, but somehow they are blocked for Google.
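
For context, the index sitemap just references the child sitemaps, roughly in this form (the URLs below are placeholders, not my actual ones):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://yourdomain.com/sitemap_products_1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://yourdomain.com/en/sitemap.xml</loc>
      </sitemap>
    </sitemapindex>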

 

Here's the robots.txt:

fad_su
Excursionist
14 0 3

@Ternence Google Search is able to crawl the sitemap now, but the issue persists with Ahrefs: its crawls keep failing. I used to get a lot of SEO improvement suggestions from Ahrefs, but not anymore. It's frustrating!