It is indeed related to crawl rate - I've tried this today with Screaming Frog.
Go to Configuration > Speed and set Max Threads to 1.
I tried setting Max Threads to 2, but the errors start populating even at 2 threads.
Enjoy your painfully slow crawl! A giant pain in the ass for an SEO audit.
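For anyone scripting their own checks instead of using a crawler GUI, the same throttling idea can be sketched like this. This is a minimal, hypothetical example (the function names, delay values, and pluggable `fetch` callable are my own, not from any tool mentioned above): fetch one URL at a time, sleep between requests, and back off once when a 430 comes back.

```python
import time

def crawl_slowly(urls, fetch, delay=2.0, backoff=30.0):
    """Fetch URLs one at a time (the equivalent of Max Threads = 1),
    sleeping `delay` seconds between requests.

    `fetch` is any callable that takes a URL and returns an HTTP status
    code, so real transport (urllib, requests, ...) can be plugged in.
    On a 430 (Shopify's too-many-requests-from-one-IP block), wait
    `backoff` seconds and retry that URL once before moving on.
    """
    statuses = {}
    for url in urls:
        code = fetch(url)
        if code == 430:
            # Rate-limited: back off, then retry the same URL once.
            time.sleep(backoff)
            code = fetch(url)
        statuses[url] = code
        time.sleep(delay)  # polite gap before the next request
    return statuses
```

The key point, matching the support explanation below, is that the limit is per-IP and per-rate: one thread plus a delay keeps you under it, regardless of which HTTP library actually does the fetching.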
Got a great explanation about this from Tier 2 tech support at Shopify regarding the many 430 errors showing up in my SEOProfiler.com bot crawl of my Shopify store. (SEOProfiler uses a bot that crawls your site and reports all kinds of SEO data; very nice tool for a one-person shop.)
BACKGROUND: About 2 weeks ago I started getting "error 430" responses on my site, which had been consistently clean. First I asked SEOProfiler to examine the cause of the 430s, since neither I nor my coder had touched anything in the theme; we'd only made a few small HTML changes to products and info pages, and nothing else changed during the time the 430s started showing up.
THEN I took the issue to Shopify and had a few back-and-forths with chat-level tech support. After the 3rd round I asked to be escalated to the next level of tech support. Within 24 hours, here's what they explained to me:
Jun 4, 22:52 EDT
I'm Emily, from Shopify's technical support team. Beth let me know about the problem you're having with 430 errors when you use SEO Profiler. I've looked into this for you and I have some more information about it.
The 430 error is happening because you're getting too much traffic on your site from the same IP, much too quickly, so we're blocking it temporarily for security reasons. It makes a lot of sense that you could see this error when you're using a service like that to check your site, because that is exactly what is happening: a ton of requests are coming through to your site, very quickly, all from the same IP.
The good news is, that error does not indicate there are any SEO problems with your shop. It also does not affect your customers' shopping experience in any way.
To avoid getting that error, you'll need to run the test more slowly.
I hope this helps. Please let me know if you have any other questions or concerns about this.
Tier 2 Support
So, my next step is to share this email with SEOProfiler and hope they can adjust the speed of their crawls. Anyway, since this maddening problem cost me a lot of time (but fortunately ended well, in that there were no real problems with my site's code), I thought I'd lay this out here for others.
Follow-up: I changed the delay time in the SEOProfiler bot to 30 seconds. No 430s after that. Just figure out how many pages are on your site in total and calculate a delay so that all pages (or whatever limit you have; mine's 500) can be crawled within a 24-hour period. It also helps to use Google Search Console, where you can see Google's errors during its routine crawl of your site. A bit redundant, but it's good to have the two bot results to compare, IMO.
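The back-of-the-envelope math there is simple: 500 pages at a 30-second delay is 500 × 30 = 15,000 seconds, about 4.2 hours, well inside a 24-hour window. A tiny sketch of that calculation (function names are just my own illustration):

```python
def max_delay_seconds(page_count, window_hours=24):
    """Longest per-request delay that still lets `page_count`
    pages be crawled inside the window."""
    return (window_hours * 3600) / page_count

def crawl_duration_hours(page_count, delay_seconds):
    """How long a full crawl takes at a given per-request delay."""
    return page_count * delay_seconds / 3600

# 500 pages at a 30 s delay -> 500 * 30 / 3600 ≈ 4.17 hours
# Longest delay that still fits 500 pages in 24 h -> 86400 / 500 = 172.8 s
```

So with a 500-page limit you could in principle go as slow as ~172 seconds per request and still finish in a day; 30 seconds leaves plenty of headroom.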
I just commented on a separate post with two different solutions to this problem. Here's a direct link to the solution...
Hope this helps!