We're using SEMrush for some SEO work with Shopify accounts and we're getting a lot of 429 errors, which I assume is Shopify throttling access to the robot.
SEMrush allows you to override the robot settings, but this requires access to robots.txt, which Shopify doesn't allow.
Is there any other way to throttle robot access to avoid these errors?
Hey Bo - that tool should (or at least did the last time I looked at it) allow you to set a rate limit from within the app. That wouldn't need access to the robots file.
Hey Jason, that's what I expected as well, but the crawl delay settings only seem to support respecting robots.txt (see attached).
I'll reach out to SEMrush as well to see if a different setting is buried somewhere. Google wasn't much help.
While I wait for a response from SEMrush, it's probably good feedback anyway for Shopify to explicitly set the crawl delay they expect in their robots.txt.
Since we can't configure it ourselves, it would help if Shopify set a rate they consider reasonable, to reduce these errors from legitimate audit tools.
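For reference, the directive being discussed is a single line in robots.txt. If Shopify published something like the following, bots that honor the (non-standard but widely supported) Crawl-delay directive would pause between requests. The 10-second value here is purely illustrative, not anything Shopify has recommended:

```
User-agent: *
Crawl-delay: 10
```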
An update: It seems like this isn't possible from either end. This is the response I received from SEMrush:
Apologies for the inconvenience. We do understand that Shopify users are not able to edit the contents of the robots.txt file and we hope to come up with a solution to this problem in the future. I wish I could help you more.
It would be great if Shopify limited the crawl rate in their own robots.txt. Without that setting, we have zero control over whether Shopify blocks a legitimate tool or not, since we often can't set the crawl limit on the robot's side.
I am having this same problem with SEMrush, 7,823 new '429' errors in my latest report I just received today! I need to override SEMrushBOT's default behavior by telling it to follow the "crawl-delay" directive in the robots.txt file. But I can't access the robots.txt file. What a conundrum.
Laura, I would recommend putting pressure on SEMrush to manually set a crawl delay. I have asked them to do so, but more voices will increase the chances of it making its way into an update.
Since Shopify controls how the bot is allowed to access and analyze your site, I think the fault may lie in something not being properly configured on your end.
I'm new to Shopify, not to SEO. These 429 errors are a bummer; renders the SEMrush site audit tool virtually useless. There's another post on Shopify about this same issue (Why is every page of my store giving 429 errors in semrush.com?). On that post, a Shopify employee said "If you go into your SEMrush.com settings, you can manually set a delay for the bot in SEMrush. When you do that it reduces the amount of 429 errors that will appear." As far as I can tell, this feature doesn't exist in SEMrush. That employee also linked to this API Limit article, which has me concerned that the only way to fix this issue is to upgrade to a Shopify Plus account. Has anyone found a solution to this yet? Thanks guys!
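For what it's worth, if you end up writing your own audit scripts instead of relying on SEMrush, you can sidestep the problem by honoring the Retry-After header that servers may send with a 429 response (it can be either a number of seconds or an HTTP-date). Here's a minimal sketch of that parsing logic; the helper name is my own, not from any Shopify or SEMrush API:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from typing import Optional

def retry_after_seconds(header_value: str, now: Optional[datetime] = None) -> float:
    """Return how long to wait, in seconds, given a 429 response's Retry-After header.

    The header may be a plain number of seconds ("120") or an
    HTTP-date ("Wed, 21 Oct 2015 07:28:05 GMT").
    """
    now = now or datetime.now(timezone.utc)
    try:
        # Numeric form: delay in seconds.
        return max(0.0, float(header_value))
    except ValueError:
        # Date form: wait until the given moment.
        retry_at = parsedate_to_datetime(header_value)
        return max(0.0, (retry_at - now).total_seconds())
```

A crawler would then `time.sleep(retry_after_seconds(resp.headers["Retry-After"]))` before retrying the request, rather than hammering the store and racking up thousands of 429s.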