The robots.txt file tells Google what it is allowed to crawl. If Google can't access the file at all, it stops crawling the site, since it has no way of knowing which pages it should avoid.
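If you want to verify this yourself, here's a quick sketch (standard-library Python, with a placeholder store URL you'd swap for your own) that checks whether the robots.txt file is reachable right now:

```python
# Quick self-check: can robots.txt be fetched at this moment?
# "your-store.example.com" is a placeholder; use your own domain.
import urllib.request
import urllib.error

ROBOTS_URL = "https://your-store.example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print(f"HTTP {resp.status}: robots.txt is reachable")
except urllib.error.HTTPError as e:
    # 4xx generally means "no robots.txt", which Google treats as
    # "crawl everything"; a 5xx server error is what makes Google
    # pause crawling, as described above.
    print(f"HTTP {e.code}: robots.txt returned an error")
except urllib.error.URLError as e:
    print(f"Connection failed: {e.reason}")
```

A plain 200 response here is a good sign that the earlier failure was temporary.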
It looks fine to me now, so I presume it was a temporary issue.
Google should check it again soon, see that it's fine, and continue on its merry crawling way.
Is it possible your storefront was password-locked at the time the crawl happened? That could explain why the robots.txt file was inaccessible.
Looks like there's no lock on the store at the moment, so I'm able to see your sitemap without issue.
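If you'd like to double-check on your end, here's a rough sketch (again with a hypothetical domain) that parses the live robots.txt, confirms Googlebot is allowed to crawl the homepage, and lists any sitemaps the file declares:

```python
# Rough sketch: parse robots.txt and see what Googlebot may crawl.
# "your-store.example.com" is a placeholder for your storefront.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://your-store.example.com/robots.txt")
rp.read()  # fetches and parses the live file

# Would Googlebot be allowed to crawl the homepage?
print(rp.can_fetch("Googlebot", "https://your-store.example.com/"))

# Any "Sitemap:" lines declared in robots.txt (Python 3.8+)
print(rp.site_maps())
```

If that prints `True` and shows your sitemap URL, Google should have no trouble on its next visit.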