OpenAI Integration - "Answer question about a table" error message

Hi :wave:

I’m building an app in Glide using the OpenAI integration’s “Answer question about a table” action, and I keep getting error messages that say, “Answer question about a table: This model’s maximum context length is 4097 tokens, however you requested 6914 tokens (6659 in your prompt; 255 for the completion). Please reduce your prompt; or completion length.”

I’ve tried multiple troubleshooting steps, like adjusting the source tables, changing the temperature, adjusting the prompt and its size, and adding a new API key, but nothing seems to work. Also, even if I delete my previous actions, it seems the tokens from earlier attempts aren’t being cleared and are added to the token count for the current run.

When I use the other OpenAI integration actions, I don’t have this problem. It’s been pretty consistent for over a week now. Any thoughts?

In general, that error is saying you need to reduce the overall amount of text (tokens) being sent to their service. Try reducing the number of columns, or the content inside each column, passed from the source table to get under the token limit. Worst case, if you have a big essay column for instance, you could summarize it into a new column and then point the source table columns to that summary instead.
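If you want to sanity-check the size outside of Glide before the action runs, here’s a rough sketch in Python using OpenAI’s tiktoken library. The limits are just the numbers from your error message, and the model name is an assumption on my part:

```python
# Minimal sketch: estimate whether a prompt will fit the model's context window.
# MAX_CONTEXT and COMPLETION_BUDGET are taken from the error message above;
# the model name "gpt-3.5-turbo" is an assumption.
import tiktoken

MAX_CONTEXT = 4097        # model's maximum context length, per the error
COMPLETION_BUDGET = 255   # tokens reserved for the model's answer, per the error

def prompt_fits(prompt_text: str, model: str = "gpt-3.5-turbo") -> bool:
    """Return True if the prompt plus the completion budget fits in the context window."""
    encoding = tiktoken.encoding_for_model(model)
    prompt_tokens = len(encoding.encode(prompt_text))
    total = prompt_tokens + COMPLETION_BUDGET
    print(f"{prompt_tokens} prompt + {COMPLETION_BUDGET} completion = {total} tokens")
    return total <= MAX_CONTEXT
```

In your case the math is 6659 prompt + 255 completion = 6914 tokens, well over 4097, so the table data plus the question needs to shrink by roughly 2800+ tokens before the request will go through.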

Hey @Jeremy! Thanks for your suggestion. I tried it but it didn’t work. :frowning:

I summarized the column I was looking to use AI on, then deselected the other columns. I got a new error message about embeddings, which is the same thing this post is about. So I grabbed only 12 rows of data and created a new column, then updated the action to use the new column, but I still see the same error message. I’m not processing much data at all, so I’m not sure how I could be hitting limits (I double-checked my OpenAI account and I’m not hitting any limits there either).

Is there a chance your question is too long? 6659 is a lot anyway.