I'm trying to crawl a new e-commerce client's website in Screaming Frog to complete a full technical audit. It's a relatively large site with over 100k URLs; for future crawls I won't need to crawl the whole thing, but on this occasion I do. As there doesn't seem to be a way to whitelist my IP on Shopify domains, I set up a new user-agent signature and applied it to the Screaming Frog crawl.
However, I'm still receiving hundreds of 429 (Too Many Requests) responses, primarily from product URLs, which are the key pages I need to analyse, even after reducing the crawl speed and swapping user-agents. I stopped crawling over the weekend, but even after three days the 429 errors persist.
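For reference, this is roughly how I've been spot-checking whether the block has lifted between attempts. It's a minimal Python sketch using the requests library; the product URL and user-agent string below are placeholders, not the client's real values:

```python
import requests

# Placeholder product URL; in practice I test a real URL from the client's catalogue.
URL = "https://example-client.myshopify.com/products/sample-product"

# Placeholder for the custom user-agent signature I applied in Screaming Frog.
HEADERS = {"User-Agent": "AuditCrawler/1.0 (+https://example-agency.com/bot)"}

response = requests.get(URL, headers=HEADERS, timeout=10)
print(f"Status: {response.status_code}")

# Per the HTTP spec, a 429 may include a Retry-After header saying how long to wait.
if response.status_code == 429:
    print(f"Retry-After: {response.headers.get('Retry-After', 'not set')}")
```

That's how I confirmed the 429s were still being returned three days on.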
It's really important that I'm able to audit these pages. Is there anything else I can do to stop this from happening?
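For context, if Screaming Frog itself can't get through, the fallback I'm considering is scripting a very slow fetch of just the product URLs, honouring Retry-After and backing off exponentially on each 429. A rough sketch under the same placeholder assumptions as above (and assuming Retry-After comes back in seconds rather than as an HTTP date):

```python
import time
import requests

# Placeholder user-agent, as above.
HEADERS = {"User-Agent": "AuditCrawler/1.0 (+https://example-agency.com/bot)"}

def fetch_with_backoff(url, max_retries=5, base_delay=2.0):
    """Fetch a URL politely, honouring Retry-After and backing off on 429s."""
    response = None
    for attempt in range(max_retries):
        response = requests.get(url, headers=HEADERS, timeout=10)
        if response.status_code != 429:
            return response
        retry_after = response.headers.get("Retry-After")
        # Use the server's suggested wait if present, otherwise double the delay each attempt.
        delay = float(retry_after) if retry_after else base_delay * (2 ** attempt)
        time.sleep(delay)
    return response

# Placeholder URL; the real run would iterate over the full product URL list.
resp = fetch_with_backoff("https://example-client.myshopify.com/products/sample-product")
print(resp.status_code)
```

But that feels like a workaround rather than a fix, hence the question.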