Shopify Webhook Requests Swamped My App

Jason_Buehler
Excursionist
12 0 4

Earlier today we were receiving upwards of 500 webhook requests per minute from one store, presumably triggered by one installed app making sweeping changes to variants. This effectively DOS'd our app, taking it offline for a period of time. This is frustrating, but I don't really have a good idea of how to handle it. Has anyone run into this problem? If so, what did you do about it? I wonder if making a separate application that just handles webhooks would be the solution, but that seems heavy-handed.

Thanks for your suggestions in advance, 

Jason

Replies 42 (42)

Robert_Banh
Shopify Partner
11 0 1

Hi Jason, there are many solutions and they vary in complexity. You can set up a queue so all webhooks get added into the queue, then another job that processes the items in the queue. Also check out https://www.iron.io/
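A minimal sketch of that pattern, using an in-process Ruby Queue as a stand-in for a real job backend (Sidekiq, Iron.io, etc.); the names here are illustrative, not from any particular library:

```ruby
require "json"

# In-process stand-in for a real job backend (Sidekiq, Iron.io, etc.).
WEBHOOK_QUEUE = Queue.new

# The webhook endpoint does no real work: enqueue and return 200 fast.
def enqueue_webhook(topic, raw_body)
  WEBHOOK_QUEUE << { topic: topic, payload: JSON.parse(raw_body) }
  200
end

# A separate worker drains the queue at whatever pace the app can sustain.
def process_next_webhook
  job = WEBHOOK_QUEUE.pop
  # ... real processing would happen here ...
  job[:topic]
end

status  = enqueue_webhook("products/update", '{"id": 1}')
handled = process_next_webhook
```

The point is that the HTTP response time no longer depends on processing time, only on enqueue time.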

Bayuo_B__Blaise
Shopify Partner
3 0 0

I used to have a similar issue; I had to change my webhook processing implementation to handle rapid webhook requests.

Jason_Buehler
Excursionist
12 0 4

Ya, that's what we decided to do. Basically we unsubscribe shops if we receive a certain number of requests at a certain rate, and then allow the shop to re-enable them and re-ingest their products after the fact. Kinda crummy, though, that Shopify rate-limits the API but doesn't allow you to set rate limits on the webhook requests that they send your app.

HunkyBill
Shopify Partner
4853 60 568

You can re-architect things so that you just accept and validate the webhook, and throw it in a queue to work on later. Quickly return 200 OK and you're in better shape. 500 rpm is ~8 req/sec which is peanuts for the most part. 
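The "accept and validate" step here is the HMAC check: Shopify signs each webhook body with your app's shared secret and sends the result, base64-encoded, in the X-Shopify-Hmac-Sha256 header. A sketch (the secret is a dummy):

```ruby
require "openssl"
require "base64"

# Dummy value; a real app reads its shared secret from config.
SHARED_SECRET = "my_app_shared_secret"

# Compare our own HMAC-SHA256 of the raw body against the header Shopify sent.
def webhook_valid?(raw_body, hmac_header)
  digest   = OpenSSL::HMAC.digest(OpenSSL::Digest.new("sha256"), SHARED_SECRET, raw_body)
  expected = Base64.strict_encode64(digest)
  # NB: production code should use a constant-time comparison here
  expected == hmac_header
end

body = '{"id":123}'
good_hmac = Base64.strict_encode64(
  OpenSSL::HMAC.digest(OpenSSL::Digest.new("sha256"), SHARED_SECRET, body)
)
```

Once `webhook_valid?` passes, return 200 and enqueue; everything else happens later.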

If you miss a request when too busy, no worries, Shopify resends it. Many times. 

So while getting hammered seems like trouble, with the modern fast queues out there today (Redis, RabbitMQ, Beanstalkd, etc.), you should be able to cope without doing crazy things like unsubscribing the shop...

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Jason_Buehler
Excursionist
12 0 4

HunkyBill, I guess I should have stated that this is an app that is a sort of trial run (growing the infrastructure as we can afford to within our earnings), so we don't have it on a full production stack yet. Currently we are on hobby instances on Heroku, which limits you to 1 dyno per role, so you can't scale horizontally. The app already has background job queueing in place, so we are grabbing webhooks and queueing them up for later already, and it has 3 Puma server instances handling requests, but the rate at which Shopify sends them is too fast for a small stack like this, regardless of how you architect it. So I guess I was kinda surprised that the bar to entry is probably 4 standard-1x Heroku web instances with a standard-1x queueing dyno and a DB to match, so probably $150-ish/month minimum for hosting, which is pretty much all due to Shopify DOSing smaller setups. I am really just looking for clever suggestions about how people have handled this on a small stack like this.

HunkyBill
Shopify Partner
4853 60 568

Hi Jason,

4 Dynos for handling 8 req/sec??? I am using one and handling ~ 10... no trouble. 

I get away with 1 dyno for WWW, 1 for a worker, and a DB, so my costs are ~$100 on a busy app. Go from there... I agree, though: for an app with no income, a $100/month investment seems steep... cost of doing business.

 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Jason_Buehler
Excursionist
12 0 4

Thanks for the info. Good to know where we can get to. I don't think we are far off then; maybe we just had a spike that caused our request queue to back up, or I may have actually had too many Puma web workers and put the dyno into swap. Either way, thanks for the insight into what we can achieve. I will look for ways to tune our current stack. Just out of curiosity, I assume you aren't running hobby-sized dynos since you are at around a hundo a month; do you have autoscaling enabled for your www role, and if so, does it help handle your spikes? As far as I know, standard-1x and hobby dynos are the same minus the horizontal scalability of the standard-1x.

HunkyBill
Shopify Partner
4853 60 568

I am not running auto-scaling, but I have it in my pocket if needed. I have had crazy issues with it in the past 5 years or so, where auto-scaling on errors cascaded into very expensive territory. Yeesh-ville. I just keep glancing at the response rates, and seeing them at or below 300ms makes me happy. Spikes in webhooks, as you know, are a PITA.

One happy workaround: products/update is a source of spikes, right? Some other app can generate tons of them. What to do? Don't listen to products/update. Instead, schedule a job to download products with updated_at_min set to the last time you checked. You'll only receive products actually updated in that time frame, once. Hence no overloading.
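The polling request is just the REST products endpoint with the `updated_at_min` filter. A sketch of building it (shop domain is a placeholder, and the URL path reflects the REST API of the era):

```ruby
require "uri"
require "time"

# Build the polling request: ask the REST API only for products
# changed since the last time we checked.
def products_poll_uri(shop_domain, last_checked_at)
  uri = URI("https://#{shop_domain}/admin/products.json")
  uri.query = URI.encode_www_form(updated_at_min: last_checked_at.iso8601, limit: 250)
  uri
end

uri = products_poll_uri("example.myshopify.com", Time.utc(2017, 1, 1))
```

A scheduled job fetches this URL, records the fetch time, and uses it as `updated_at_min` next run, so each changed product is seen once per cycle no matter how many times it was updated.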

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

jcrowe
New Member
8 0 0

Have you thought about using a 3rd-party queue system that can handle the overload of traffic and letting your app just poll for jobs periodically? Most of those systems handle spikes in traffic and have extremely high uptime for very little cost. It's not an ideal solution, as I agree that Shopify should not be DOSing our servers, but it could be a cost-effective workaround. I've used iron.io successfully, but I know that there are a ton of good ones out there.

Michaël_G_
Shopify Partner
74 0 59

Hi,

What we have done here is create a super-lightweight micro-service that only listens to webhooks and adds elements into a queue. This way, at worst only this micro-service gets DDoSed (and it is very small and optimized, so most requests are executed in less than 20ms).

Zac12
Shopify Partner
65 0 20

I had a similar issue with webhooks. Using Laravel, I've created a queue which handles incoming webhook requests from each site, and if a request for that webhook has already come in from that store in the last ~15-30 seconds, it ignores them. Works well when someone is making mass changes to products and you're receiving `products/update` a lot.

Zyber Developer | https://apps.shopify.com/trademe | https://apps.shopify.com/productfilter | https://apps.shopify.com/splittest

HunkyBill
Shopify Partner
4853 60 568

Ignoring webhooks is not ideal when the payload they contain is actually important. You can only deduce that if you inspect the payload. Ignoring webhooks is therefore a poor substitute for avoiding dupes. 

This remains a hard cloud computing skill to master. 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Jason_Buehler
Excursionist
12 0 4

I love the idea of a microservice that handles these webhooks. If I can find some time I will probably go that road; as HunkyBill said, they are being sent for a reason, so if you are listening for one you pretty much need to look at them all, which makes a dedicated service whose whole job is waiting for and processing webhook requests ideal.

In the interim we basically made a throttle mechanism that unsubscribes a shop from the products/update topic, sets a flag, gives the shop a notice that their data might be stale, and allows them to re-ingest all of their products and re-enable the products/update topic when they go to a view that needs that data. It seems to have alleviated the DOS problems pretty well, and we haven't heard from our customers that they are being put out by it. A dedicated service for ingesting and processing them really would be ideal, though, especially if we could make it serve multiple apps.

 

Zac12
Shopify Partner
65 0 20

@HunkyBill Agreed. My circumstance is a bit different, due to what a webhook request is actually doing in my application. Not so much "here is some new data to record" but more of a "there is more data available to record. Queue some time later to check it out". My controller has a check in it which sees if there is a request already in the queue linked to that store with that webhook, and due to the job not being run yet, it clears this new one. There's no point in two jobs in the queue that are going to do the same thing and get the same data.
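Zac's time-window approach can be sketched like this: drop a webhook if one for the same (shop, topic) pair was accepted within the window. This is an in-memory illustration; a multi-process app would keep this state in Redis or similar:

```ruby
# Drop a webhook when one for the same (shop, topic) pair was already
# accepted within the debounce window. In-memory sketch only.
DEBOUNCE_WINDOW = 15 # seconds

@last_accepted = {}

def accept_webhook?(shop, topic, now = Time.now)
  key  = [shop, topic]
  last = @last_accepted[key]
  return false if last && (now - last) < DEBOUNCE_WINDOW
  @last_accepted[key] = now
  true
end

t0    = Time.now
first = accept_webhook?("a.myshopify.com", "products/update", t0)
dupe  = accept_webhook?("a.myshopify.com", "products/update", t0 + 5)
later = accept_webhook?("a.myshopify.com", "products/update", t0 + 30)
```

As the thread notes, this only makes sense when the webhook is a "check for new data later" signal rather than the data itself.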

Zyber Developer | https://apps.shopify.com/trademe | https://apps.shopify.com/productfilter | https://apps.shopify.com/splittest

swymdev
Shopify Partner
12 0 1

The other question to ask is whether you REALLY need to be on Heroku. I fully understand that's a decision with broad implications, but it's probably best to ask that question when you are early-stage. Your approach of being conservative and scaling with need totally resonates with me and makes a ton of sense. However, our experience was that Heroku wasn't the ideal stack to explore that approach with. It worked really well from a convenience, supportability and ease-of-deployment perspective for our pilot/beta scenarios, but with production loads, the costs tend to go up very quickly. We eventually got off Heroku and rebuilt our deployment from the ground up on AWS, and that's worked much, much better from the standpoint of keeping costs reasonable and making it easier to scale.

Helping you maximize engagement with your repeat visitors
Our Solutions: Wishlist+| In Stock Alerts | Triggered Emails | Shopping Assistant

Jason_Buehler
Excursionist
12 0 4

This is a good point. We have managed to stay pretty cheap on our app so far by being clever, but when we need horizontal scalability it is going to get expensive quickly, like you mention. At some point the cost-benefit of moving to a cheaper but less managed service will make it a good move; we aren't there yet, though. It's a lot of overhead to manage infrastructure on AWS compared to Heroku, but it is miles ahead in cost for any sizable stack.

Zuby1
Shopify Partner
9 0 1

I recently had a similar issue (not as many as 500rpm), but I saw a lot of webhooks hitting my app in succession, and I realised that a lot of these were duplicate requests coming from Shopify for the exact same event.

What I did was create a persistent cache in my webhook handler, so if the same request came in within a minute of the previous one, it would immediately return an HTTP 200 to Shopify rather than processing it again. This de-duped a surprisingly large number of the requests, as they were merely dupes being hammered into my app.

Hope that helps somewhat

Kind Regards Zuby

HunkyBill
Shopify Partner
4853 60 568

I tried the cache approach too, but ended up disabling it. It has the same race-condition issues. Imagine your app is beyond tinker-toy and runs on a few different threads on a few different processors. When you get one webhook and set the cache, there is no guarantee the other threads will have seen that key by the time the dupe webhook comes in. So they try to set the same key. With 2 webhooks hitting within milliseconds of each other, it happens. I tried the CAS strategy, but again, setting the cache was too slow. I don't see the cache as the problem; rather, the speed at which you can set cache keys and ensure they are global to all threads on all processors means some hits are dupes...
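The race goes away if the "have I seen this?" check and the "mark it seen" write are a single atomic step: in Redis that is `SET key 1 NX EX 30` (set only if absent, with a TTL). A framework-free in-memory equivalent using a mutex, just to show the shape:

```ruby
require "set"

# One shared "seen" set guarded by a mutex, so check-and-mark is atomic.
SEEN_IDS  = Set.new
SEEN_LOCK = Mutex.new

def claim?(webhook_id)
  # Set#add? returns nil when the id was already present, so exactly one
  # caller can ever win for a given id, even under heavy concurrency.
  SEEN_LOCK.synchronize { !SEEN_IDS.add?(webhook_id).nil? }
end

winners = 10.times.map { Thread.new { claim?("orders/updated-999") } }.map(&:value)
```

Only the winning thread processes the webhook; the rest just return 200 and drop it.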

But ya... it can cut down on anything reasonable like 300ms apart or more... one would hope.


Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Zuby1
Shopify Partner
9 0 1

As long as your app runs on the same webserver, threading shouldn't be an issue. I had my cache persist above the request level, and manually created a cache key from the webhook type (i.e. OrderUpdated, AppUninstalled, etc.) with the Shopify order id or hashed store URL concatenated on.

My cache key would simply be 'OrderUpdated-123123124124123213', and I'd not save a body; for any dupe requests I see for the same webhook type and order number, I'd immediately return a 200.

This seemed to work fine across threads for me. Keep your cache in memory and have it persist only for 30-60 seconds. That would be fast enough, no?

Kind Regards Zuby

Jim_Kane
Shopify Partner
5 0 2

Reading through this, I think another option would be to apply backpressure to Shopify in a Rack middleware. A previous poster noted that Shopify will retry their requests, so if you have a rack middleware that checks the queue and returns a 503 (Service Unavailable) when the queue passes a high-water mark, that should smooth out the big spikes in incoming traffic. I assume that you are using a Rack-based framework b/c of your mention of Puma. You might get some use out of https://github.com/dryruby/rack-throttle as well, as a simpler (rate-based) alternative.
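Jim's backpressure idea can be sketched as a framework-free Rack-style middleware; the high-water mark and the injected `queue_depth` callable are illustrative, not from rack-throttle:

```ruby
# Shed webhook load with a 503 when the job queue is past a high-water
# mark; normal traffic and a healthy queue pass straight through.
class WebhookBackpressure
  HIGH_WATER = 1_000

  def initialize(app, queue_depth:)
    @app = app
    @queue_depth = queue_depth # callable returning current queue length
  end

  def call(env)
    if env["PATH_INFO"].start_with?("/webhooks") && @queue_depth.call > HIGH_WATER
      [503, { "Retry-After" => "30" }, ["queue full"]]
    else
      @app.call(env)
    end
  end
end

app  = ->(env) { [200, {}, ["ok"]] }
busy = WebhookBackpressure.new(app, queue_depth: -> { 5_000 })
idle = WebhookBackpressure.new(app, queue_depth: -> { 0 })
```

Whether returning 503 to Shopify is wise is a separate question (see the discussion of retry/removal behaviour in this thread); the sketch only shows the mechanism.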

HunkyBill
Shopify Partner
4853 60 568

@Jim,

The problem here is Webhooks are a one-way mechanism in the sense that anything other than a 200 OK response means Shopify invokes their countdown mechanism to Webhook removal. So when you return anything else, eg: 503, you risk terminating that Webhook. Not what you want. 

I return a joke 498 when I actually want Shopify to remove my Webhook. The funny thing about the App game is sometimes you play it wrong. For example, I removed a store from my App without first removing the Webhooks. Now that store is a total basket case Zombie. I get Webhooks from it, but it's not subscribed to the App. And there is no way for me to manually kill it. 

But ya... would be nice if we could simply not get hammered with the same Webhook within mere milliseconds. 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Nick_Sweeting
Tourist
10 0 2

We handle 1500 rpm from webhooks, and have a 5-6 ms turnaround. Do your authentication, then dump it to a job queue. If you're limited with Redis memory, you may need to parse the webhook to extract the data you need - otherwise you may get massive RAM spikes.
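Nick's memory-saving trick is to shrink each payload to the fields the app actually needs before it goes into Redis. A sketch (the field list is illustrative):

```ruby
require "json"

# Keep only the fields the app actually uses; a products/update payload
# can carry huge body_html and variants arrays we never look at.
KEEP_FIELDS = %w[id title updated_at].freeze

def slim_payload(raw_json)
  JSON.parse(raw_json).slice(*KEEP_FIELDS)
end

raw = JSON.generate(
  "id" => 1, "title" => "Shirt", "updated_at" => "2017-01-01T00:00:00Z",
  "body_html" => "x" * 10_000, "variants" => Array.new(50) { { "sku" => "s" } }
)
slim = slim_payload(raw)
```

Enqueueing `slim` instead of `raw` keeps Redis memory roughly proportional to the number of pending jobs rather than the size of the store's product descriptions.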

HunkyBill
Shopify Partner
4853 60 568

@Nick that is some short turn-around time to respond to a request, authenticate it, dish it off to a queue and respond with 200 OK... 5-6ms... Outstanding! But like you say, zero processing is going on and you're gobbling up memory with those requests when dished off to Redis. Peanuts though when you're sitting on a machine with few GB to play with.

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Pogodan
Shopify Partner
76 0 13

We've been running into this issue recently, and our new solution is to use a specialized Phoenix/Elixir webhook handler. We may wind up porting more of our apps to Elixir over time, but for the moment all this does is:

  • save the webhook data to the Postgres DB
  • queue a background job in Redis

We use Exq for Sidekiq compatibility, so the rest of the webhook processing occurs in the Rails app as before.

Here's an example benchmark (wrk, with a real-world JSON webhook):

  12 threads and 50 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   112.16ms   34.88ms 264.20ms   67.73%
    Req/Sec    35.67     12.01    80.00     66.31%
  12837 requests in 30.09s, 3.12MB read
Requests/sec:    426.58
Transfer/sec:    106.29KB


 

Pogodan | https://experts.shopify.com/pogodan-dev

HunkyBill
Shopify Partner
4853 60 568

426 requests per second. Wow. That is some kind of App. Blitzed!

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Mike_Potter
Shopify Partner
83 0 15

Hi all. We just wrote a blog post about how we handle webhooks in a "serverless" way with AWS API Gateway, Lambda, SQS and ElasticBeanstalk workers. You can read about it at https://medium.com/@mike_potter/handle-shopify-webhooks-without-a-server-ccd2ada62ece

We've also open sourced our Ruby code for the workers so you can get up and running easily.

Mike

Co-Founder, Rewind (Backups for Shopify). Apps: Rewind, Shopify Staging (setup test and staging Shopify stores).

Jim_Kane
Shopify Partner
5 0 2

Pogodan, that's an interesting solution. If you decide to move more processing to elixir, take a look at

https://github.com/Boulevard/shopify

I have contributed a few fixes to it for a small side project.

Pogodan
Shopify Partner
76 0 13

Hey Jim thanks for the comment. I actually saw that lib before and decided to experiment with my own implementation (highly unfinished / not intended for real-world use at the moment), mainly for the sake of using metaprogramming to generate near-full API coverage:

https://github.com/themgt/shopify

I'd absolutely like to coalesce around a single Elixir Shopify client library solution - I think a lot of the Ruby community is beginning to migrate to Elixir.

Pogodan | https://experts.shopify.com/pogodan-dev

Sean57
Shopify Partner
32 0 16

Hey Guys,

I am having a similar issue, but it is more the speed at which the requests are coming in than the amount. I am using the shopify_app gem to handle the incoming requests and send them to a job. The job immediately checks to see if the order_id is already in the Redis queue and, if not, adds it. Now, this should happen so fast that there should be almost no chance of the same order being processed twice, but Shopify manages to send two webhook requests so close together that it is too fast for even the Redis write/read to catch it.

Here is how quick those requests are coming in:

I, [2016-12-14T09:41:08.412695 #23680]  INFO -- : Received Webhook Call: orders_updated
I, [2016-12-14T09:41:08.413111 #23680]  INFO -- : Received Webhook Call: orders_updated

That is only about a 0.0004 second difference.

My code is extremely simple, but I was wondering if anyone had any insight on what I may be missing:


  def perform(shop_domain:, webhook:)
    logger = Logger.new(STDOUT)
    redis = Redis.new(timeout: 10)
    redis_cache_key = "#{shop_domain}-order-#{webhook["id"]}"
    cached_order = redis.get(redis_cache_key)

    logger.info "Checking redis: #{cached_order}"

    if cached_order == "true"
      logger.info "Order (#{webhook["id"]}) already synced.. not syncing again. Thanks Redis:"
      logger.debug cached_order
      return
    else
      redis.set(redis_cache_key, "true")
      redis.expireat(redis_cache_key, Time.now.to_i + 30) # expire in 30 seconds
    end

    # ... sync the order here ...
  end

 

Pogodan
Shopify Partner
76 0 13

Sean - we've also (especially recently) been seeing these sort of very close duplicate webhooks from Shopify. I'm not sure if it's technically a bug or what, but I think the "solution" is necessarily domain-specific (i.e. to your app/requirements).

 

One thing you could consider would be delaying the job processing (maybe just by a few seconds): if you say "process_order(123)" delayed for 5 seconds, you can let duplicate jobs come in and get "collapsed" into a single processing task.
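The collapsing idea can be sketched as: only schedule a delayed job for an order if one isn't already pending. This is an in-memory illustration; a real app would lean on its job backend's uniqueness or scheduling features:

```ruby
# Schedule a delayed sync per order, collapsing duplicates that arrive
# while a job is still pending. In-memory sketch only.
@pending_orders = {}
@scheduled_jobs = []

def schedule_order_sync(order_id, delay: 5, now: Time.now)
  return :collapsed if @pending_orders[order_id] # already queued; drop dupe
  @pending_orders[order_id] = true
  @scheduled_jobs << { order_id: order_id, run_at: now + delay }
  :scheduled
end

results = [schedule_order_sync(123), schedule_order_sync(123),
           schedule_order_sync(123), schedule_order_sync(456)]
```

When the job actually runs, it clears its `@pending_orders` entry, so later webhooks for that order schedule a fresh sync.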

Pogodan | https://experts.shopify.com/pogodan-dev

HunkyBill
Shopify Partner
4853 60 568

I was thinking a better algorithm would be to simply dump all the webhooks in your queue, whatever that may be. I too am often amused (not in a good way) when this happens... .003, .005, thousandths of a second between jobs sucks... it means they fired off 2 at pretty much the same time...

So in this case, you'd have two identical jobs.. based on the webhook id and the payload. 

So instead of checking the ID in the job perform... don't... the job is meant as nothing more than "stick this crap in your cubby hole, and take your seat and wait... ".

In processing the jobs themselves... you have a separate process that just POPS them off the Queue... and checks then and there if it has seen this one before... since all jobs take some finite time... you could even set your "Done this one" flag at the end of the actual work... 

Of course if you are chewing through jobs with multiple queues you're kinda screwed as this falls apart.. but hey... 
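The pop-time dedupe described above, as an in-process sketch with a single consumer (which is the case where it holds together):

```ruby
require "set"

# Everything goes into the queue untouched; one consumer dedupes at pop time.
queue = Queue.new
seen  = Set.new

[101, 101, 102, 101, 103].each { |id| queue << { webhook_id: id } }

processed = []
until queue.empty?
  job = queue.pop
  next unless seen.add?(job[:webhook_id]) # nil means we already handled this id
  processed << job[:webhook_id]
end
```

With multiple consumers, the `seen` check would have to move to shared, atomically updated storage, which is exactly where it falls apart as noted above.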

 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Sean57
Shopify Partner
32 0 16

Pogodan,

I thought about using a delay, but then I realized that even with a delay it would still fire them off separately, just the same distance apart as when they came in. I even thought about setting a random delay between 1 and 5 seconds to create some separation. It all seems way too hacky though.

Hunky,

I thought about that as well: having one queue to store the webhook data, then another to process it. But like you said, if you had multiple threads handling the second queue (which I think you would need in a larger-scale app), there is still the potential for the duplication to happen.

One thing I do have in place is rate limiting on the Sidekiq queue, so I am thinking that maybe I can set the rate to 1 per second instead of 5 every 2 seconds, which may create a gap in the processing for me.

If that doesn't work I may being doing sleep(rand(5)), even though it seems absolutely silly 🙂

 

HunkyBill
Shopify Partner
4853 60 568

The ability to check a master list across N queues is probably all that is needed. A PITA for sure.. but still..

The approach from the rewind.io guys looks so excellent, I am sure the same solution is applicable there too

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Sean57
Shopify Partner
32 0 16

@hunkybill,

What is the rewind.io solution you speak of? Are you talking about Mike's solution on medium.com? I love that idea, but the app I am building is for a client that does not have the budget for that. I doubt their app will ever have the traffic to necessitate the AWS setup, but it would be fun to implement.

UPDATE: never mind Bill, I just realized that the medium.com guys are rewind.io.

I still think that with the separate queue (essentially we would have two queues, one for the jobs and one for the processing) we could run into the same issue unless the queue processor is limited to a single thread.

HunkyBill
Shopify Partner
4853 60 568

Right, but honestly, for most people, one queue processor is probably fine. They have a super warped use-case where getting hammered by webhooks is the nature of the business. 

With one queue to manage things... you can easily run many many jobs in the background with many queue processors... and probably never hit any kind of burp. 

Note too that their solution was dirt cheap. In terms of cloud computing, you would be hard pressed to spend $100/month on it.. which is saying something... so you're dealing with uber cheap clients if they think $100/month is a lot of money... 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Sean57
Shopify Partner
32 0 16

OK Here is my current solution that seems to be working.

Because I am already using Sidekiq/Redis and have some rate limiting set up in my Sidekiq job, I used that to limit the orders_updated job to 1 job per second. I had to override the webhooks controller that comes with the shopify_app gem and run the jobs myself (because the gem wants to use perform_later, which ActiveJob uses, but I cannot use ActiveJob when including Sidekiq::Worker).

So now my job class looks like this:

 

class OrdersUpdatedJob
  include Sidekiq::Worker
  require 'unirest'
  require "redis"

  sidekiq_options throttle: {
                    threshold: 1,
                    period: 1.seconds,
                    key: ->(shop_domain, webhook){ shop_domain }
                  },
                  :retry => 5

  sidekiq_retry_in do |count|
    10 * (count + 1) # (i.e. 10, 20, 30, 40, 50...)
  end

  sidekiq_retries_exhausted do |msg|
    Sidekiq.logger.warn "Failed #{msg['class']} with #{msg['args']}: #{msg['error_message']}"
  end

  def perform(shop_domain, webhook)
    ...
  end
end

 

Any my new webhooks controller looks like this:

class WebhooksController < ShopifyApp::WebhooksController
  include ShopifyApp::WebhookVerification

  class ShopifyApp::MissingWebhookJobError < StandardError; end

  def orders_updated
    OrdersUpdatedJob.perform_async(shop_domain, webhook_params)
    head :no_content
  end

  def app_uninstalled
    AppUninstalledJob.perform_async(shop_domain, webhook_params)
    head :no_content
  end

  def receive
    webhook_job_klass.perform_async(shop_domain, webhook_params)
    head :no_content
  end

  private

  def webhook_params
    params.except(:controller, :action, :type)
  end

  def webhook_job_klass
    "#{webhook_type.classify}Job".safe_constantize or raise ShopifyApp::MissingWebhookJobError
  end

  def webhook_type
    params[:type]
  end

end

 

I am sure there is a better way to handle the response, and I probably don't need to include most of those functions since they are brought in by inheritance. The reason I am not funneling everything to the receive method is that I was getting an error in webhook_job_klass with .classify. I figured I could simply create my own methods and not have to worry about it, for time-saving reasons.

Last part to modify was the routes:

Rails.application.routes.draw do
  root 'home#index'
  post '/webhooks/orders_updated', to: 'webhooks#orders_updated'
  post '/webhooks/app_uninstalled', to: 'webhooks#app_uninstalled'
  post '/webhooks/receive', to: 'webhooks#receive'

  mount ShopifyApp::Engine, at: '/'
end

 

So now my orders_updated job will only run once every second, which ensures that Redis has my order_id stored and the database has already been updated to say that the order has been synced.

Note: I am using https://github.com/gevans/sidekiq-throttler for the throttling.. unless you want to pay the $2K a year for the enterprise version of Sidekiq, you have to use an alternate method such as this one.

 

HunkyBill
Shopify Partner
4853 60 568

Thanks Sean! Nice posting.

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Jason27
Shopify Partner
113 3 44

I have a user that is sending over 20,000 products/update webhooks per day, with 4,000 products. I have another with 17,000 products that only sends 100 per day.

This is nuts; I only want a certain kind of products/update webhook. How will this improve??

Jason_Buehler
Excursionist
12 0 4

There are some shops that do mass updates, and your app receives the giant rush of webhooks; unfortunately it's just part of the game. We ended up making a throttling mechanism that removes the product update webhook when a shop exceeds a certain number in a certain time period, and then puts that shop on a queue to unthrottle and re-ingest the products a short time later. Our initial scaling concerns were due to writing the webhooks to the DB for forensic purposes; since we never used them, we got rid of that, and it became quite performant. We have over a thousand active shops and are happily running on one Heroku dyno with this method. We receive the webhook and immediately drop it onto a queue to be processed later, so there is very little overhead, since Redis is so lightweight to write to.

I am not sure what framework you are using, but since we use Rails, if I wanted to filter out the ones I didn't want, I would write some custom middleware to just reflect the ones I didn't care about. That way you don't get all the way into the application layer before you realize a request isn't relevant, and you can save the time for processing another one.
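That reflecting-at-the-middleware idea might look like this: a Rack-style filter that immediately 200s any topic outside an allowlist, so unwanted webhooks never reach the app. The allowlist is illustrative; Shopify does send the topic in the X-Shopify-Topic header:

```ruby
# Acknowledge-and-discard any webhook topic we don't care about before it
# reaches the application layer.
class TopicFilter
  WANTED_TOPICS = ["products/update", "app/uninstalled"].freeze

  def initialize(app)
    @app = app
  end

  def call(env)
    topic = env["HTTP_X_SHOPIFY_TOPIC"]
    if topic && !WANTED_TOPICS.include?(topic)
      [200, {}, [""]] # acknowledge so Shopify doesn't retry, then discard
    else
      @app.call(env)
    end
  end
end

inner  = ->(env) { [200, {}, ["handled"]] }
filter = TopicFilter.new(inner)
```

Note this filters whole topics; filtering a "certain kind" of products/update webhook would still require inspecting the payload, which this sketch deliberately avoids.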

HunkyBill
Shopify Partner
4853 60 568

Handling massive queues is an awesome skill. Keep the App dead simple, when you get hammered with webhooks dump them in a queue, and chug away at processing that queue nice and steady. 

Not nuts. Basic computing skills. You cannot pass first year Comp Sci without doing projects more complex than that. 

Custom Shopify Apps built just for you! hunkybill@gmail.com http://www.resistorsoftware.com

Mike_Potter
Shopify Partner
83 0 15

The solution I posited above works very well for us. We handle hundreds of thousands of webhooks every day without issue. 

We open sourced all the code to do it yourself and wrote a post on how to set it up at https://medium.com/@mike_potter/handle-shopify-webhooks-without-a-server-ccd2ada62ece?source=linkSha...

 

Mike

Co-Founder, Rewind (Backups for Shopify). Apps: Rewind, Shopify Staging (setup test and staging Shopify stores).