You have dynamic data in the prompt, so it won't always be the same length. Looking at your text, it seems like 1999 tokens might not be enough. You could test the output of that prompt by putting the exact same rendered prompt into an email (or similar) and counting the words, or by pasting it into OpenAI's API playground to test.
Use an email action to output it and double-check against the actual rendered text, not the Liquid template. Most likely one of your variables is outputting more than you expect.
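Before pasting into the playground, you can sanity-check the rendered prompt's size with a rough estimate. This is a minimal Python sketch assuming the common rule of thumb of roughly 4 characters per token for OpenAI models; for an exact count you would use OpenAI's `tiktoken` tokenizer instead:

```python
def estimate_tokens(text: str) -> int:
    """Approximate the OpenAI token count of `text` using the
    rough ~4-characters-per-token heuristic (not an exact count)."""
    return max(1, len(text) // 4)

# Example: a rendered prompt with a long, variable-length item list.
rendered_prompt = "Summarize this order: " + "item description " * 500
estimated = estimate_tokens(rendered_prompt)
print(f"~{estimated} tokens")
if estimated > 1999:
    print("Prompt likely exceeds a 1999-token budget")
```

If the estimate is anywhere near the limit, inspect each Liquid variable's rendered output separately to find the one that is larger than expected.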
Yes, this is true, though it's not the variables but the prompt itself: with one example it works. Still, this is strange. I have tried the same prompt with the same variables in Zapier, and it returns the value from OpenAI. Could this be a Flow limit on the size of the prompt sent to OpenAI?