OpenAI "Ask a question about a table" ceased working

Yesterday I built this app using the OpenAI plugin.
It was working great until yesterday evening.
I also sent many requests today, but none of them worked.
According to my OpenAI account, they didn’t even receive any requests today.
Is this a bug on the Glide side?

Hi @Tim_Gestels

Thanks for reporting this!

Could you tell me if there is any error on the builder when you send a request to OpenAI? If not, requesting Support when you are logged into Glide would be great, as we pass critical application information to the support form. You should have the App you are having the issue with open in Glide when you start the support request.

There is a Help icon at the bottom right side of the screen. Selecting Support from the menu will direct you to the Support webpage.


Hi Santiago,

I get the error: Ask a question about a table: Invalid input table

But I don’t remember changing anything.
And yesterday this was working.
Is this an issue on my end?



@SantiagoPerez I deleted and recreated the action, and now it suddenly works.
If it happens again I’ll let you know!
Thanks for the feedback anyway!

I haven’t used this yet, but I want to ask a question. Theoretically, could you ask OpenAI a question and have it look at your table and give an answer?

Ex: a table of shirts with ratings. Could you ask which one is the highest rated and have it give you that?

It should work that way.


Or you could use a rollup column :wink:

Well, I was thinking that if people wanted to skip all the filters and just ask “Show me the highest rated Italian restaurant near me”, it could look at your restaurant table, filter for the Italian ones, then find the highest rating and pop out an answer. Is that possible?

Yeah, although it may not be 100% reliable. I was doing some testing with it early on, trying those sorts of queries, and getting inconsistent results. After a conversation with @Jeremy I realised that isn’t really its intended use case. It’s better suited to less deterministic, more open-ended questions.


Hi @Darren_Murphy / Jeremy, can you clarify what the intended use case is, in that case?
Going by its name, it sounds quite a lot like this is the intended use case :slight_smile:

I would suppose that if you do a bit of prompt engineering (adding wrapper text that explains to the model how it should interpret the question and how it should answer), perhaps combined with the newer OpenAI models, it should give more consistent results, right?
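To make the "wrapper text" idea concrete: the usual pattern is to prepend instructions and the table context to the user's question before sending it to the model. This is a minimal sketch only — Glide's actual prompt template is not public, and `build_prompt` and the sample rows here are hypothetical:

```python
def build_prompt(question, table_rows, instructions):
    """Wrap the user's question with guidance text and table context.

    Hypothetical sketch: Glide's real prompt template is not public.
    """
    # Flatten each row dict into a "col: value" line of context.
    table_text = "\n".join(
        ", ".join(f"{k}: {v}" for k, v in row.items()) for row in table_rows
    )
    return (
        f"{instructions}\n\n"
        f"Here is the table data:\n{table_text}\n\n"
        f"Question: {question}\n"
        f"Answer:"
    )


# Toy data for illustration only.
rows = [
    {"name": "Trattoria Roma", "cuisine": "Italian", "rating": 4.8},
    {"name": "Sushi Bar", "cuisine": "Japanese", "rating": 4.5},
]
prompt = build_prompt(
    "What is the highest rated Italian restaurant?",
    rows,
    "You are the expert on the data in this app. "
    "Give a short and concise answer and try to answer in one go.",
)
print(prompt)
```

The resulting string is what would be sent as the model input; tightening the instruction text is exactly the "prompt engineering" lever discussed above.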

@SantiagoPerez even after rebuilding the thing I keep getting the same error. Should I ask for support? In your message you told me to ask for support if I didn’t see an error message …

Here’s a short video of what’s going wrong. If you have any additional questions, let me know.

Update: now it’s working again :joy:
Can’t put my finger on it …

Yes, please! Submit a support ticket with the video attached so we can take a closer look.

Do you have any columns selected for each table? Click the “Select columns” below the source table picker to confirm.

@jeremy I have selected all columns.
Perhaps this is a bit too much for the prompt?

This morning it was working again quite well, then suddenly I got this error:

“Ask a question about a table: This model’s maximum context length is 4097 tokens, however you requested 4483 tokens (4228 in your prompt; 255 for the completion). Please reduce your prompt; or completion length.”

The only additional context I gave is the following: “You are the expert on the data in this app. Give a short and concise answer and try to answer in one go.”

Is this perhaps due to the additional context data that Glide fetches from the tables? In that case, would that mean this only works for smaller datasets?

That’s quite a long prompt. How many rows and columns are we looking at here?

There aren’t a lot of rows in the app yet, since this is just a test, but there are a lot of columns.
The question itself is just one line, so I suppose that in the background Glide is fetching data from the tables and including it in the prompt … Probably that’s where this is going wrong? I would like to know how this works so I can design around the limitations.
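The limitation in the error above is a fixed context window: prompt tokens plus completion tokens must fit within 4,097. You can budget for that up front. A minimal sketch, assuming the common rule of thumb of roughly four characters per token for English text (OpenAI's tiktoken library gives exact counts); `fits` and the constants are illustrative, not Glide's actual logic:

```python
MAX_CONTEXT = 4097   # model's total context window, per the error message
COMPLETION = 255     # tokens reserved for the answer, per the error message


def estimate_tokens(text):
    """Rough token estimate: ~4 characters per token for English text.

    For exact counts, OpenAI's tiktoken library would be used instead.
    """
    return max(1, len(text) // 4)


def fits(prompt):
    """Check whether prompt + reserved completion fit the context window."""
    return estimate_tokens(prompt) + COMPLETION <= MAX_CONTEXT


print(fits("What is the highest rated shirt?"))  # small prompt fits
print(fits("x" * 20000))                          # huge prompt does not
```

In the error quoted earlier, 4,228 prompt tokens + 255 completion tokens = 4,483, which exceeds 4,097 by 386 tokens, so trimming columns (and therefore prompt text) is the direct fix.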

I would think they have to include all the columns you chose in the prompt, and do that for all rows in the table. That’s why I asked whether we’re doing something crazy columns-wise or rows-wise to get to that many tokens.

@Tim_Gestels David himself confirmed what happens when you submit a prompt.


So in short, not all rows, only X “most related” rows.


Try selecting fewer columns or reducing the overall number of rows given to the action for now. Their API doesn’t handle embeddings that large at this time. They might fix that in the future, or we might provide a better workaround by caching the embeddings on our side.
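The "X most related rows" selection described above is typically done with embeddings: each row and the question are turned into vectors, and only the rows whose vectors are closest to the question's vector get included in the prompt. A minimal sketch with toy 2-D vectors standing in for real embedding vectors (the actual embedding model and value of k Glide uses are not public):

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def top_k_rows(question_vec, row_vecs, k=2):
    """Return indices of the k rows most similar to the question vector."""
    ranked = sorted(
        range(len(row_vecs)),
        key=lambda i: cosine(question_vec, row_vecs[i]),
        reverse=True,
    )
    return ranked[:k]


# Toy vectors standing in for real embeddings of a question and three rows.
question = [1.0, 0.0]
rows = [[0.9, 0.1], [0.0, 1.0], [0.8, 0.3]]
print(top_k_rows(question, rows))  # [0, 2] — rows 0 and 2 are closest
```

This is why only "most related" rows reach the model: the prompt stays within the token budget regardless of total table size, at the cost of the model never seeing rows the similarity search missed.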
