I have the same problem; does anybody have an answer? (It seems to be an S3 limitation.)
The GraphQL Bulk Operations API has a 20 MB file size limit on the input file for bulk mutations, which causes broken pipe errors when attempting to upload larger files (e.g., ~36 MB).
Workaround Solution:
Split the JSONL input into multiple files, each under 20 MB, and run them as separate, sequential bulk mutation operations.
Performance Example:
Updating ~12k product variants (price, cost, description, tags, weight, metafields) across 2 bulk operations (~21.2 MB total) took approximately 11 minutes to complete.
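As a rough sketch of the splitting step, the helper below chunks a JSONL file on line boundaries so each part stays under the 20 MB limit (the function name, output naming scheme, and the `MAX_BYTES` constant are illustrative assumptions, not Shopify-provided tooling; each resulting part would still need to be staged and submitted as its own bulk operation):

```python
import os

# Assumption: Shopify's documented 20 MB bulk-mutation input limit.
MAX_BYTES = 20 * 1024 * 1024


def split_jsonl(path, max_bytes=MAX_BYTES):
    """Split a JSONL file into <path>.partN.jsonl chunks, each <= max_bytes.

    Lines are never cut in half, so every chunk remains valid JSONL.
    Returns the list of chunk file paths, in order.
    """
    parts, buf, size = [], [], 0
    with open(path, "rb") as src:
        for line in src:
            # Flush the current chunk before this line would overflow it.
            if buf and size + len(line) > max_bytes:
                parts.append(_write_part(path, len(parts) + 1, buf))
                buf, size = [], 0
            buf.append(line)
            size += len(line)
    if buf:
        parts.append(_write_part(path, len(parts) + 1, buf))
    return parts


def _write_part(path, index, lines):
    out = f"{path}.part{index}.jsonl"
    with open(out, "wb") as dst:
        dst.writelines(lines)
    return out
```

Running the parts sequentially (rather than concurrently) also matches how bulk operations behave, since a shop can only have one bulk mutation in progress at a time.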
Outstanding Issue:
Users have asked how to handle large datasets without splitting them into multiple bulk mutation operations. The documentation will be updated to state this limitation more explicitly.