I want to allow both "User-agent: AhrefsBot" and "User-agent: AhrefsSiteAudit" to crawl all the pages on my site for SEO purposes.
However, they are currently blocked by robots.txt. I don't want to completely delete robots.txt, as I still want it to block everything else and to stay in place for any future updates.
So I've tried to create an "unless" rule using this guide:
I'm not that familiar with the Liquid language, so instead of making an exception for the user agents, I've made the exception for the pages, since both bots need access to the same pages.
However, I've only been able to unblock one page at a time; even when I chain conditions with `and`/`or`, only the last page actually gets unblocked.
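For reference, this is roughly what my working single-page version looks like, assuming the structure of Shopify's default robots.txt.liquid template (the `/example-page/` path is just a placeholder for one of my real pages):

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- comment -%}
      Skip (i.e. don't print) the Disallow line for this one page,
      so it stays crawlable. Chaining a second page onto this
      condition with and/or is where it breaks for me: only the
      last page ends up unblocked.
    {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/example-page/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```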
Is there an easy way to allow these bots to crawl my whole site, or to unblock several pages at once?