Shopify support suggested I post on here for some tips on fixing an issue with my store not being crawled properly. bonandbear.com
I understand robots.txt is useful for pages like the cart, which you might not necessarily want Google to crawl, but all my blog pages are key for content SEO, and my product pages are just as vital.
I used Ahrefs to run a site crawl, and that is where all of these issues were flagged.
My homepage, bonandbear.com, is also repeatedly flagged as missing an H1 tag, but I have no idea how or where to add one!
Any help would be welcome for a novice and new business.
The robots.txt is a text file that tells search engine crawlers which pages not to crawl, such as cart and checkout pages. As far as I can see, your robots.txt file is fine. It is not blocking any of your products or blog posts; it is only blocking pages that do not need to be indexed, such as duplicate blog and product variants. None of your products are blocked by robots.txt, as you can see in the screenshot below.
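If you want to verify this yourself rather than relying on a crawler's report, Python's standard library includes a robots.txt parser. Here is a minimal sketch; the rules below are modelled on Shopify's typical defaults and the URLs are illustrative examples, not your store's exact file:

```python
from urllib.robotparser import RobotFileParser

# Example rules modelled on a typical Shopify robots.txt
# (an assumption for illustration -- fetch your real file from
# https://bonandbear.com/robots.txt to test the actual rules).
rules = [
    "User-agent: *",
    "Disallow: /cart",
    "Disallow: /checkout",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Product and blog URLs are allowed; cart/checkout are blocked.
print(parser.can_fetch("*", "https://bonandbear.com/products/example-product"))  # True
print(parser.can_fetch("*", "https://bonandbear.com/blogs/news/example-post"))   # True
print(parser.can_fetch("*", "https://bonandbear.com/cart"))                      # False
```

Running this against the live file (via `parser.set_url(...)` and `parser.read()`) is a quick way to confirm whether a specific product or blog URL is actually disallowed, independent of what any audit tool claims.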
Ahrefs is a third-party tool that flags issues according to its own algorithm, so its reports should not be taken at face value; they need to be checked manually. Based on what I can see, there is no page you need to remove from your robots.txt file.
As per SEO guidelines, every page should have exactly one H1 tag, and your home page does have one. As you can see in the screenshot below, there is 1 H1 tag, along with 12 H2 tags and 13 H3 tags.
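If you ever want to double-check a heading count yourself instead of trusting an audit tool, a short Python sketch using the standard-library HTML parser will do it. The sample markup below is purely illustrative, not your actual home page; in practice you would feed in the fetched HTML of bonandbear.com:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts h1/h2/h3 opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0, "h3": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Illustrative markup only -- replace with your real page source.
sample = (
    "<html><body>"
    "<h1>Bon and Bear</h1>"
    "<h2>New in</h2><h2>Best sellers</h2>"
    "<h3>Free shipping</h3>"
    "</body></html>"
)

counter = HeadingCounter()
counter.feed(sample)
print(counter.counts)  # {'h1': 1, 'h2': 2, 'h3': 1}
```

A count of exactly one H1 is what you want to see; zero or more than one is what tools like Ahrefs will flag.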
Please note that I have had two separate site audits of my Shopify store done, one by Ubersuggest and one by Ahrefs. Both reported pages blocked by robots.txt, but Ubersuggest (see attached) found only 5, while Ahrefs found 1,176 blocked pages! I also have broken links. How do I correct this? Help, please!