OpenAI Integration - "Answer question about table" error message

Hi :wave:

I’m building an app in Glide and using the OpenAI integration’s “Answer question about a table” action, and I keep getting error messages that say: “Answer question about a table: This model’s maximum context length is 4097 tokens, however you requested 6914 tokens (6659 in your prompt; 255 for the completion). Please reduce your prompt; or completion length.”

I’ve tried multiple troubleshooting steps: adjusting the source tables, changing the temperature, adjusting the prompt and its size, and adding a new API key. Nothing seems to work. Also, even if I delete my previous actions, it seems the tokens aren’t cleared after an attempt and get added to the token count for the current run.

When I use the other OpenAI integration actions, I don’t have this problem. It’s been happening consistently for over a week now. Any thoughts?

In general, that error is saying you need to reduce the overall amount of text going to their service. Try reducing the number of columns, or the content inside each column, passed from the source table to get under the token limit. Worst case, if you have a big essay for instance, you could ask to summarize the essay column into a new summarized column and then point the source table columns to this summary.
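To see why the numbers in the error add up the way they do: the prompt and the completion share a single context window, so it’s one budget, not two. A minimal sketch of that check, using the figures from the error message (the 4-characters-per-token ratio is only a rough rule of thumb, not the real tokenizer):

```python
# Rough sketch of the budget check the API enforces.
# Numbers come from the error message quoted above.
CONTEXT_WINDOW = 4097  # context length reported in the error

def fits_in_context(prompt_tokens: int, completion_tokens: int,
                    context_window: int = CONTEXT_WINDOW) -> bool:
    """True if the request fits; prompt and completion share one window."""
    return prompt_tokens + completion_tokens <= context_window

def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

# The failing request: 6659 (prompt) + 255 (completion) = 6914 > 4097
print(fits_in_context(6659, 255))       # False
print(6659 + 255 - CONTEXT_WINDOW)      # 2817 tokens over budget
```

So trimming the completion length alone won’t help here; the prompt itself (the table data Glide sends) has to shrink by roughly 2,800 tokens.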

Hey @Jeremy! Thanks for your suggestion. I tried it but it didn’t work. :frowning:

I summarized the column I was looking to use AI on, then deselected the other columns. I got a new error message about embeddings, which is the same thing this post is about. So I grabbed only 12 rows of data and created a new column, then updated the action to use the new column, but I still see the same error message. I’m not processing much data at all, so I’m not sure how I could be hitting limits (I double-checked my OpenAI account and I’m not hitting any limits there either).

Is there a chance your question is too long? 6659 tokens is a lot anyway.

Hello,

I use the OpenAI action…
model: gpt-4o
It works fine with a max length of 4096 tokens.
But when I set the actual context window from the OpenAI docs,
128,000 tokens, it doesn’t work.

  • even if I set the maximum answer length to 124,000 to take into account my input (1,355 tokens plus a buffer)

Any suggestions?
Is it a Glide limitation? If so, I’m ready to pay 2 updates to get the full use of OpenAI’s capacity… :wink: even if this double billing wouldn’t be fair, as I’m also paying for my OpenAI account usage…



Isn’t your message showing that you’re requesting 129,355 tokens, which exceeds gpt-4o’s 128,000-token limit? I guess I don’t understand your question.

Also, I’m not 100% sure the OpenAI model “gpt-4o” actually works yet. I haven’t seen any posts or documentation on that.

Thanks for answering.

To simplify our conversation.

maximum token window for gpt-4o: 128,000
I request an answer length of 124,000.
So there is enough left for my system prompt + user prompt (1,500 + 1,500 = 3,000 total),
plus a buffer (1,000 tokens).

I confirm the action works with the gpt-4o model selected in Glide, but only with a maximum of 4,096 tokens.
It should be 128,000.
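The budget described above can be checked with simple arithmetic (all numbers come from this post; whether Glide actually caps Maximum Length at 4,096 is the open question):

```python
# Token budget sketch for gpt-4o, using the figures from the post above.
GPT4O_CONTEXT = 128_000  # context window per the OpenAI docs

system_prompt = 1_500
user_prompt = 1_500
buffer = 1_000
answer_length = 124_000  # the Maximum Length value being requested

total = system_prompt + user_prompt + buffer + answer_length
print(total, total <= GPT4O_CONTEXT)  # 128000 True
```

So the request itself is within gpt-4o’s window; if it fails, the 4,096-token ceiling would have to be coming from somewhere else, e.g. a platform-side cap.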

Any specialist?
@grumo :wink: Creator of an excellent AI quiz template worth trying…

Merci

I’ve never used more than 4096 tokens in any of my AI apps, including QuizAI.
However, the Maximum Length tooltip text should be updated to say 124000 (or Glide should let us know what the actual limit is).


Thanks

I actually need fewer than 8,000 tokens myself most of the time…

But maybe up to 80,000 tokens sometimes:
“Write my ebook chapter about xyz” from an audio recording of keywords.

Also, maybe add these to the Glide AI action parameters:

  • a setting for the maximum tokens used per call
  • temperature, etc…

Have you tried the Glide AI model? I believe the latest one points to gpt-4o. Also, what sentences/terms did you use in your prompt to make it use the full amount of tokens?
