Robots.txt: removing several default rules

Jens17
New Member

I want to allow both "User-agent: AhrefsBot" and "User-agent: AhrefsSiteAudit" to crawl all the pages on my site for SEO purposes.

However, they are blocked by robots.txt. I don't want to delete robots.txt completely, as I still want it to block everything else and to pick up any future updates.

So I've tried to create an "unless" rule using this guide:

https://shopify.dev/tutorials/customize-theme-customize-robots-txt-liquid

I'm not that familiar with the Liquid language, so instead of making an exemption for the user agents, I've tried to make one for the pages, since it's the same pages for both bots.

However, I've only been able to unblock one page at a time; even using and/or operators, it still only unblocks the last page.
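
Here's roughly what I've been trying in templates/robots.txt.liquid, based on the guide above — the two page paths are placeholders, not my real pages:

```liquid
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- comment -%}
      Skip the default Disallow rules for the pages I want crawled.
      The two paths below are placeholders for my real pages.
    {%- endcomment -%}
    {%- unless rule.value == '/pages/example-one' or rule.value == '/pages/example-two' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```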

Is there an easy way to allow these bots to roam my site, or to unblock several pages at once?

KieranR
Shopify Partner

You could do this to completely remove the Ahrefs groups: 

[screenshot: a robots.txt.liquid edit that removes the Ahrefs groups]
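
In text form it would look something like this — a sketch of templates/robots.txt.liquid that skips any default group whose user agent contains "Ahrefs":

```liquid
{% for group in robots.default.groups %}
  {%- comment -%}
    Drop the whole group when its user agent is one of the Ahrefs
    bots (AhrefsBot, AhrefsSiteAudit). Every other group renders
    exactly as Shopify's default robots.txt would.
  {%- endcomment -%}
  {%- unless group.user_agent.value contains 'Ahrefs' -%}
    {{- group.user_agent }}
    {%- for rule in group.rules -%}
      {{ rule }}
    {%- endfor -%}
    {%- if group.sitemap != blank %}
      {{ group.sitemap }}
    {%- endif %}
  {%- endunless -%}
{% endfor %}
```

With their own groups gone, both Ahrefs bots fall back to the User-agent: * rules, so they can crawl everything other crawlers can.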

Though I'm not sure I understand what you mean by "for SEO purposes."

I haven't needed to adjust robots.txt to allow Ahrefs on any sites so far.


Full-time Shopify SEO guy, based in NZ. Sometimes freelancing outside the 9-5.