GraphQL Admin API: Bulk Query Producing Duplicate Orders (same gid)


Hi,

I'm using the GraphQL Admin API to perform a bulk query on unfulfilled orders over a date range, and the result consistently contains two orders with the same ID, one immediately after the other. Is this intended behavior? I can work around it, but I felt it might be important to report. Here's some additional information about my bulk query:

The GraphQL query looks like this:

{
  orders(query: "created_at:>={start_date} created_at:<={end_date} fulfillment_status:null") {
    ...
  }
}

where {start_date} and {end_date} are the respective ends of the date range and ... is the body of the query. The result looks like this:

...
{"__typename":"Order","id":"gid:\/\/shopify\/Order\/0000","name":"#9999"}
{"__typename":"LineItem","id":"gid:\/\/shopify\/LineItem\/1111","quantity":1,"__parentId":"gid:\/\/shopify\/Order\/0000"}
{"__typename":"Order","id":"gid:\/\/shopify\/Order\/0000","name":"#9999"}
{"__typename":"LineItem","id":"gid:\/\/shopify\/LineItem\/1111","quantity":1,"__parentId":"gid:\/\/shopify\/Order\/0000"}
...

I've changed the actual IDs, but they are duplicated in exactly this way in the JSONL file from Shopify. As you can see above, I also request the order's line items, and those are duplicated as well.
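For now, my workaround is just to drop the repeated objects when reading the file back in. A minimal sketch (the file names here are placeholders):

import json

# Dedupe the downloaded bulk-operation result; "bulk_result.jsonl" is a
# placeholder for wherever the JSONL file was saved.
seen = set()
with open("bulk_result.jsonl") as src, open("bulk_result_deduped.jsonl", "w") as dst:
    for line in src:
        obj = json.loads(line)
        # Every object in the file carries a globally unique gid in "id",
        # so a repeated gid means a repeated Order or LineItem object.
        if obj["id"] in seen:
            continue
        seen.add(obj["id"])
        dst.write(line)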

Let me know if you would like any more information to replicate this issue.

Thanks.

Shopify Staff

Hey @jamespackard,

Thanks for reporting this. Are you able to DM me the store details + exact query?


When looping through the returned results, I'm finding a lot of duplicate orders as well. The following is my query:

query ($num: Int!, $cursor: String) {
  orders(first: $num, after: $cursor, sortKey: CREATED_AT, query: "created_at:>2020-06-01 and created_at:<2020-07-31 and financial_status:paid") {
    pageInfo {
      hasNextPage
    }
    edges {
      cursor
      node {
        name
        id
        createdAt
        updatedAt
        displayFinancialStatus
      }
    }
  }
}

 

CSV Output:

web_order_id,order_id,created_at,updated_at,financial_status
Line 8:   950352639,2372522967102,2020-06-01T14:16:12Z,2020-06-01T14:16:24Z,PAID
Line 257: 950352639,2372522967102,2020-06-01T14:16:12Z,2020-06-01T14:16:24Z,PAID
[Repeats 5+ more times]

 

Code:

    # Excerpted from a larger class; logger and config are application-level
    # objects. Imports used: csv, json, time, shopify (ShopifyAPI), and
    # omegaconf's OmegaConf.
    def execute_query(self, query, variables=None):
        try:
            logger.debug(f"Executing query with vars {variables}")
            results = shopify.GraphQL().execute(query, variables)
            return OmegaConf.create(json.loads(results))
        except Exception as ex:
            logger.debug(ex)  # NB: returns None if the request fails

    def run_report(self):
        start = time.time()
        logger.debug(f"Starting report export || Pulling {config.shopify.query_item_limit} orders per batch")
        with open(config.graph_queries.reports) as gql:
            query = gql.read()

        with open('fgt_orders.csv', 'w') as csvfile:
            header = ['web_order_id', 'order_id', 'created_at', 'updated_at', 'financial_status']
            writer = csv.DictWriter(csvfile, fieldnames=header)
            writer.writeheader()

            # First batch
            varis = {"num": config.shopify.query_item_limit}
            start_batch = time.time()
            parsed = self.execute_query(query, variables=varis)
            batch_num = 1
            edges = parsed.data.orders.edges
            logger.debug(f"Working batch {batch_num} || Cursor: {edges[0].cursor}")

            for order in edges:
                writer.writerow(self._process_row(order))

            batch_elapsed = round(time.time() - start_batch, 2)
            total_elapsed = round(time.time() - start, 2)

            hasNextPage = parsed.data.orders.pageInfo.hasNextPage
            cursor = edges[0].cursor  # cursor taken from the FIRST edge of the batch

            logger.debug(f"Finished batch {batch_num} [{batch_elapsed}] [{total_elapsed}] || Next page: {hasNextPage} || first id: {edges[0].node.name} || last id: {edges[len(edges) - 1].node.name}")

            while hasNextPage:
                start_batch = time.time()
                time.sleep(config.shopify.throttle_time)
                batch_num += 1
                n_varis = {"num": config.shopify.query_item_limit, "cursor": cursor}
                n_parsed = self.execute_query(query, variables=n_varis)
                hasNextPage = n_parsed.data.orders.pageInfo.hasNextPage
                n_edges = n_parsed.data.orders.edges
                cursor = n_edges[0].cursor  # again the FIRST edge's cursor

                logger.debug(f"Working batch {batch_num} || Cursor: {cursor}")

                for n_order in n_edges:
                    writer.writerow(self._process_row(n_order))

                batch_elapsed = round(time.time() - start_batch, 2)
                total_elapsed = round(time.time() - start, 2)
                logger.debug(f"Finished batch {batch_num} [{batch_elapsed}] [{total_elapsed}] || Next page: {hasNextPage} || first id: {n_edges[0].node.name} || last id: {n_edges[len(n_edges) - 1].node.name}")

        logger.debug(f"Completed report generation: {batch_num} batches [{round(time.time() - start, 2)}]")

    def _process_row(self, order):
        return {
            "web_order_id": order.node.name,
            "order_id": order.node.id.replace('gid://shopify/Order/', ''),
            "created_at": order.node.createdAt,
            "updated_at": order.node.updatedAt,
            "financial_status": order.node.displayFinancialStatus,
        }

 

I've verified the same duplicates with Postman as well, and in each case I've checked that the cursor being sent is correct.
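In case it's useful for comparison, here is a stripped-down version of the same loop that paginates from pageInfo.endCursor, so each request resumes after the last item of the previous page rather than after its first edge. This is a sketch only: session setup is omitted, the page size is a placeholder, and the seen set is just a guard against repeated gids.

import json

import shopify  # assumes an active shopify.Session has already been set up

PAGE_SIZE = 250  # placeholder

QUERY = """
query ($num: Int!, $cursor: String) {
  orders(first: $num, after: $cursor, sortKey: CREATED_AT,
         query: "created_at:>2020-06-01 created_at:<2020-07-31 financial_status:paid") {
    pageInfo { hasNextPage endCursor }
    edges { node { name id createdAt updatedAt displayFinancialStatus } }
  }
}
"""

def fetch_all_orders():
    cursor = None
    seen = set()  # guard against duplicate gids across pages
    while True:
        raw = shopify.GraphQL().execute(QUERY, {"num": PAGE_SIZE, "cursor": cursor})
        orders = json.loads(raw)["data"]["orders"]
        for edge in orders["edges"]:
            node = edge["node"]
            if node["id"] in seen:
                continue  # skip anything already emitted
            seen.add(node["id"])
            yield node
        if not orders["pageInfo"]["hasNextPage"]:
            break
        # advance from the END of the page, not from edges[0].cursor
        cursor = orders["pageInfo"]["endCursor"]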

Thanks!

Shopify Staff

Thanks @Bill_FGT - DMing for more information.
