Robots.txt blocking over 300 pages including blog and products - how do I remove safely?

Hi,

Shopify support suggested I post here for tips on fixing an issue with my store not being crawled properly: bonandbear.com

I understand robots.txt is helpful for pages like the cart, which you might not necessarily want Google to crawl, but all my blog pages are key for content SEO, and my product pages are just as vital.

I have used Ahrefs to run a site crawl and this is where all these issues have been flagged.

I also have my homepage, bonandbear.com, which the audit repeatedly says is missing an H1 tag, but I have no idea how or where to add one!

Any help would be welcome for a novice and new business.

Thanks

Hi

Hope you are doing well

The robots.txt file is a text file that tells search engine crawlers which pages not to crawl, such as cart pages, checkout/payment pages, etc. As far as I can see, your robots.txt file is fine. It is not blocking any of your products or blog posts; it is only blocking blog listing pages that do not need to be indexed, and similar collection pages on the product side. None of your products are blocked by robots.txt, as you can see in the screenshot below.

Here, only the collection pages and blog pages are blocked, not the blog posts and products.

You can also see that your blog posts are indexed on Google in the screenshot below.

Ahrefs is a tool that runs on its own crawling algorithm, so we can't rely on such tools alone: they flag "errors" according to their own heuristics, and the findings should be verified manually. Based on that manual check, our recommendation is that there is no rule you need to remove from your robots.txt file.
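One way to do that manual check yourself is with Python's built-in `urllib.robotparser`, which evaluates robots.txt rules against specific URLs. This is a minimal sketch: the rules below are an assumption modeled on Shopify's default robots.txt (not your store's actual file), and the paths are illustrative examples only.

```python
from urllib.robotparser import RobotFileParser

# Assumed sample rules, modeled on Shopify's defaults -- replace with the
# contents of https://bonandbear.com/robots.txt to test the real file.
sample_rules = """\
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /admin
""".splitlines()

rp = RobotFileParser()
rp.parse(sample_rules)

# Product and blog-post URLs fall under the default "allow"; cart,
# checkout, and admin paths match a Disallow rule and are blocked.
for path in ["/products/some-product",
             "/blogs/news/some-post",
             "/cart",
             "/checkout"]:
    verdict = rp.can_fetch("*", "https://bonandbear.com" + path)
    print(path, "allowed" if verdict else "blocked")
```

If a product or blog-post path comes back `blocked` when you run this against your real robots.txt, that is the rule worth investigating; otherwise the audit tool's warning can usually be dismissed.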

As per SEO guidelines, every page should have exactly one H1 tag, and as far as I can see, your home page does have one. As the screenshot below shows, there is 1 H1 tag, 12 H2 tags, and 13 H3 tags.

But your shop page has no H1 tag, so our recommendation is to add one there. Here is the page URL:

https://bonandbear.com/collections/shop
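If you want to confirm a page's heading structure yourself rather than trusting an audit tool, counting heading tags in the page HTML is enough. A minimal sketch using Python's built-in `html.parser`; it runs against an inline HTML sample (a stand-in I made up) so it stays self-contained, but you could feed it the fetched page source instead.

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts h1-h6 start tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

# Inline sample standing in for a fetched page.
sample_html = """
<html><body>
  <h1>Shop</h1>
  <h2>New arrivals</h2>
  <h2>Best sellers</h2>
</body></html>
"""

counter = HeadingCounter()
counter.feed(sample_html)
print(counter.counts)  # {'h1': 1, 'h2': 2}
```

A page with no `h1` key in the result (or with more than one) is the kind of finding the audit tools are reporting; an H1 count of exactly 1 means the warning is stale or wrong.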

Hello Astha,

Please note that I have had 2 separate site audits of my Shopify store/site done by Ubersuggest and Ahrefs. Both came back with pages blocked by robots.txt. Ubersuggest (see attached) found only 5, but Ahrefs found 1,176 blocked pages!!! I also have broken links!!! How do I correct this?!? HELP!!! PLEASE!!!