We’re trying to query around 500 orders to get customer, address, order line items, and product information for every product ordered. These are all the data points we need to produce a report on orders and products purchased.
Here’s a snippet of what we’re working on right now:
const GET_ORDERS = `query ($cursor: String, $queryString: String) {
  orders(first: 10, after: $cursor, query: $queryString) {
    pageInfo {
      hasNextPage
    }
    edges {
      cursor
      node {
        id
        name
        createdAt
        displayFulfillmentStatus
        tags
        note
        shippingAddress {
          id
          address1
          address2
          formattedArea
        }
        lineItems(first: 10) {
          edges {
            node {
              id
              name
              sku
              variant {
                id
                displayName
                weight
                weightUnit
              }
              product {
                id
                productType
              }
              quantity
              discountedUnitPriceSet {
                shopMoney {
                  amount
                  currencyCode
                }
              }
            }
          }
        }
        customer {
          id
          displayName
          phone
        }
      }
    }
  }
}`;
const queryString = 'created_at:>2020-10-19 AND created_at:<2020-10-26';
const [hasNextPage, setHasNextPage] = useState(false);
const [cursor, setCursor] = useState(null);

const { data: ordersData, loading, fetchMore } = useQuery(gql(GET_ORDERS), {
  variables: { cursor, queryString },
  notifyOnNetworkStatusChange: true,
  onCompleted: data => {
    setCursor(getLastCursorInResponse(data));
    setHasNextPage(data?.orders?.pageInfo?.hasNextPage ?? false);
  },
});

useEffect(() => {
  if (hasNextPage && cursor && !loading) {
    fetchMore({
      variables: { cursor, queryString },
      updateQuery: (prev, { fetchMoreResult }) => {
        if (!fetchMoreResult) return prev;
        return {
          orders: {
            ...fetchMoreResult.orders,
            edges: [...prev.orders.edges, ...fetchMoreResult.orders.edges],
          },
        };
      },
    });
  }
}, [hasNextPage, cursor, loading, fetchMore]);
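For reference, `getLastCursorInResponse` is our own small helper; a minimal sketch of what it does, assuming the standard Relay-style connection shape (`orders.edges[].cursor`):

```javascript
// Pulls the cursor off the last edge in the response, which is what the
// `after` argument expects on the next page. Returns null when there are
// no edges (e.g. the last page has been reached).
function getLastCursorInResponse(data) {
  const edges = data?.orders?.edges ?? [];
  return edges.length > 0 ? edges[edges.length - 1].cursor : null;
}
```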
Here are the approaches we’ve tried so far:

- Paginating order data: a higher page size hits the limit right off the bat, while a lower one is too sluggish, since we’re not only waiting for the server to query internally but also have to factor in the round trip of each request and response. Right now, each order requires 47 calculated points and an average of 11 actual points.
- Paginating order and product data separately: this still hits the limit after a few pages.
- Throttling the requests locally (sending a request only every second to let the limit bucket replenish itself), but that would add even more delay.
- Requesting a bulk query operation: no matter how small the request is, the absolute minimum seems to be around 15 seconds, which is still too slow for us.
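On the local-throttling idea, rather than a fixed one-second delay, the wait could be computed from the `extensions.cost.throttleStatus` object Shopify attaches to each GraphQL response (`currentlyAvailable`, `restoreRate`). A rough sketch of that calculation (the function name is ours, not part of any API):

```javascript
// Given Shopify's throttleStatus (points currently in the bucket and the
// per-second restore rate) and the calculated cost of the next query,
// return how many milliseconds to wait before sending it.
function msUntilAffordable(throttleStatus, nextQueryCost) {
  const { currentlyAvailable, restoreRate } = throttleStatus;
  const deficit = nextQueryCost - currentlyAvailable;
  if (deficit <= 0) return 0; // enough points in the bucket already
  // restoreRate is points replenished per second
  return Math.ceil((deficit / restoreRate) * 1000);
}
```

This only waits when the bucket is actually short, so pages that fit within the remaining points go out immediately instead of being delayed a full second each.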
Are there other approaches worth looking at to get a complete list of orders faster?
Or perhaps a configuration option somewhere that helps with this, something like AWS’s request limit increase?