I have an uptick of bots crawling my website right now. It’s mostly my blog pages, and they stay anywhere from 3 to 7 seconds. How do I prevent them from accessing my site? I’ve just started producing content and want a real gauge of my store’s activity instead of this fake nonsense.
All websites are publicly available for anyone to view (if the storefront password is disabled). Unfortunately, there’s not really any way for you to prevent low-quality visitors (e.g., bots) from viewing the site.
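One partial mitigation worth mentioning: well-behaved crawlers (search engines, most legitimate bots) honor robots.txt, and Shopify lets you customize it through the robots.txt.liquid template. This is only a sketch of the idea, not a full template, and the /blogs/ path here is just the standard Shopify blog prefix; it won’t stop malicious scrapers, which ignore robots.txt entirely, so it won’t clean up your analytics by itself.

```text
# Ask compliant crawlers to skip blog pages
User-agent: *
Disallow: /blogs/
```

Note that blocking /blogs/ also tells search engines not to index your blog, which likely defeats the purpose of producing content in the first place, so use this kind of rule selectively if at all.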
You cannot stop bots from skewing your results for Sessions by Location. I confirmed this with Shopify support as well. Unfortunately, it has gotten way out of hand, with the numbers off by thousands. Example: we had over 7k visitors just from Tappahannock, Virginia alone, and over 4k from Boydton, Virginia. Check the population count for each of those. Our session count is close to 100k higher than last year. Just as Shopify has to determine whether you are a human using their site, we should be able to incorporate that as well. Maybe we are not supposed to see how many are really visiting our sites. We may be really disappointed. Who knows. Just my 2 cents.