Hi,
I am trying to bulk update inventory (nightly run), but running up against a number of roadblocks - the process seems way too laborious. I was hoping someone may have a suggestion.
My store has 26K products. I need to update the inventory of a subset - 6,000 products - via a CSV provided daily.
I have looked at GraphQL, and envisaged a multi-step flow: look up each variant's inventory item ID, then update its level at each location.
This seems like too many steps, and too many queries against a sizeable database. Consider that with multiple variants per product, that's roughly 30K variants to query and update each night. My worry is this will take way too long, and risks falling over at some point or getting timed out by the host (something that happened continually during the original product uploads with the Shopify API).
If I am missing something, or there is a better way, suggestions would be really appreciated.
Thank you!
Hello!
SKU will be your bane. While a SKU is said to be a unique string code, and it typically is in inventory management systems, it isn't within Shopify. And fair enough: if inventory system A had an item to sync with Shopify as SKU-123, and inventory system B had a different item (not the same item) to sync with Shopify as SKU-123, then which item in Shopify's inventory is SKU-123?
So because of that, any kind of inventory item querying by SKU is not possible (BUT it is, as I'll explain). The REST Admin API will let you get up to 100 inventory items by ID. The GraphQL API will let you query the inventoryItems object by sku, but that query limits you to fetching 1 SKU at a time. Or does it? Remember that in Shopify a SKU does not have to be unique, so it is up to you to ensure yours are. If you can and have, then you can simply do multiple aliased queries in one request:
query {
  item_1: inventoryItems(first: 1, query: "sku:SKU-123") {
    edges {
      node {
        id
        inventoryLevels(first: 10) {
          edges {
            node {
              available
              location { id }
            }
          }
        }
      }
    }
  }
  item_2: inventoryItems(first: 1, query: "sku:SKU-124") {
    edges {
      node {
        id
        inventoryLevels(first: 10) {
          edges {
            node {
              available
              location { id }
            }
          }
        }
      }
    }
  }
}
And so on and so forth until your bucket is full - you'd need to experiment with how many you can "batch up" like that.
The benefit of the above query is that you have not only retrieved your inventory item IDs, but also the location ID and the inventory stock level at that location - which comes in handy if you want to do inventoryBulkAdjustQuantityAtLocation, as that gem only takes deltas rather than absolute levels. TIP: If you ever need to batch-get all your levels, you can simply pass a delta of 0 and it will return the current inventory level in the response, at constant cost - beats the pants off doing multiple calls or even multiple queries per request.
Well, now you calculate your deltas (desired level minus current level) for each inventory item at each location, and then you call inventoryBulkAdjustQuantityAtLocation.
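A minimal sketch of that mutation - the location and inventory item IDs here are placeholders, and per the TIP above a delta of 0 would just echo the current level back:

mutation {
  inventoryBulkAdjustQuantityAtLocation(
    locationId: "gid://shopify/Location/123"
    inventoryItemAdjustments: [
      # desired 7, current 10 -> delta of -3
      { inventoryItemId: "gid://shopify/InventoryItem/111", availableDelta: -3 }
      # desired 5, current 0 -> delta of 5
      { inventoryItemId: "gid://shopify/InventoryItem/222", availableDelta: 5 }
    ]
  ) {
    inventoryLevels {
      available
    }
    userErrors {
      field
      message
    }
  }
}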
So in theory: 1 object to query, 1 mutation. How many requests that takes will depend on how many of those inventoryItems queries you can squeeze into 1 request - and I am too lazy to do the math. I wouldn't worry about the bulk adjustment, though; that one is a monster and should plough you through quite comfortably.
Now if you read all that thinking, yeah, but our SKUs are NOT unique, well then...
Okay, I do mean that sort of seriously. Though if you still need to do it via SKU, then I'd keep a synced lookup table or index of SKU <> inventory item ID in your app, where you can go crazy and apply your SKU rules whichever way you want, then look up a SKU to get your Shopify inventory item ID - so you can batch-update levels without doing thousands of requests just to aggregate all the IDs you'd need. A paginated query can keep that index fresh; see the sketch below.
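A sketch of such an index-refresh query, assuming you page through with a cursor variable (the operation name is my own):

query SkuIndex($cursor: String) {
  inventoryItems(first: 250, after: $cursor) {
    edges {
      cursor
      node {
        id
        sku
      }
    }
    pageInfo {
      hasNextPage
    }
  }
}

Run it repeatedly, passing the last cursor back in as $cursor until hasNextPage is false, and store each id/sku pair in your table.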
Hope this helps!
Thanks again Carl.
In your experience, do you see this solution as feasible - i.e. storing a lookup database and running nightly inventory updates on 30,000 variants - or are we on the wrong eCommerce platform for this type of site?
If it is possible, would you be available, or do you know anyone who can create this type of project?
Best.
Certainly feasible. I wouldn't even limit myself to syncing at night - I'd do trickle syncs all day long rather than batched runs. The problem with batching anything against the API is the 2 requests/second rate and 40-request burst limits, meaning your batch syncs will be heavily throttled and will ultimately end up slower than you'd expect batch operations to be. So just trickle-sync constantly.
But I'd take any of my "cowboy" forum advice with a pinch of salt, and ultimately have whoever is putting theory to code commit to a solution that's feasible.
Running into the same issue. It seems there are just too many unique IDs to reference an item. I originally asked a question about using SKU to update a product, which is possible via GraphQL but not REST. But now we're building the inventory sync part of our integration and see this problem is further exacerbated by the fact that there is yet another unique ID, inventory_item_id. So our app now has to store a cross-reference table for the product ID, the variant ID, and the inventory item ID.
I get the database-normalization reasons why these unique IDs exist, given the underlying table design, but the API should be able to abstract those details away from the user, just like a traditional user interface does.
Hey @MikeyJ ,
We have a few clients that are using our Airtable sync to accomplish the same objective - bulk updating inventory at some predefined interval. Happy to show you how it works in action. Also, check out our guide on how to manage and track Shopify inventory.
Hey, Shopify has a bulk inventory adjust quantity mutation on GraphQL. You upload a JSONL file with all your updates and the location IDs. All your updates in one call!
https://shopify.dev/api/admin-graphql/2022-10/mutations/inventorybulkadjustquantityatlocation
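For what it's worth, the JSONL flow described here goes through bulkOperationRunMutation: you stage an upload via stagedUploadsCreate, write one JSON object of mutation variables per line, then hand Shopify the mutation string and the staged upload path. A rough sketch - the stagedUploadPath is a placeholder, and it's worth verifying against the docs that this particular mutation is eligible for bulk operations:

mutation {
  bulkOperationRunMutation(
    mutation: "mutation adjust($inventoryItemAdjustments: [InventoryAdjustItemInput!]!, $locationId: ID!) { inventoryBulkAdjustQuantityAtLocation(inventoryItemAdjustments: $inventoryItemAdjustments, locationId: $locationId) { userErrors { field message } } }"
    stagedUploadPath: "tmp/example-staged-upload-key"
  ) {
    bulkOperation {
      id
      status
    }
    userErrors {
      field
      message
    }
  }
}

Each JSONL line then carries the variables for one execution, e.g. {"inventoryItemAdjustments": [{"inventoryItemId": "gid://shopify/InventoryItem/111", "availableDelta": -3}], "locationId": "gid://shopify/Location/123"}, and you listen for the bulk_operations/finish webhook to know when processing is done.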
Text me if you need more info.
Hi Tom.
Do you have any experience with this endpoint? For instance, I'm curious how long it might take to update inventory for 100,000 SKUs.
G
Thank you for the quick response Tom.
Our use case has 1.4 million variants spread across hundreds of thousands of products.
We'll need to create all those products / variants first, one time... then maybe 100,000 more to add each year.
However, on a daily basis there will be pricing and inventory adjustments for an indeterminate number of variants... as this will be controlled by vendor feeds and the pricing whims of the client. I am presuming the potential number of updates on any given day could be in the six figures... though not always.
I understand about splitting the file to match the file size limit and listening for the webhook... that's manageable... but if Shopify itself would take more than a few hours to process 100,000 inventory / pricing updates, then it is probably not a feasible solution for us.
Our other plan is to just build a custom e-commerce site so our batch operations can be run on bare metal.
G