Big Tables lookups into Regular Tables and Webhooks - Performance Experiences/Expectations?

Hi folks,

I have a very simple app in final testing. I’m seeing inconsistent performance of lookups from a Glide Big Table into a Glide regular table via a single relation (Row ID match for a location). Big Tables (even with their limitations) were one of the main selling points of choosing Glide over alternative solutions.

The Big Tables | Glide Docs page mentions limitations of “Multi Lookups into Big Tables (eg. Lookups via Multiple Relations or Lookups that target an entire Big Table column)”. Emphasis on *into* Big Tables, not *from*, and *via Multiple Relations*. My case is a lookup from a Big Table into a regular table via a single relation. I have a support request open on this as well; here I’m just seeking anyone else’s experiences or guidance. This app is really important to the company and could affect as much as 30% of order volume.

Background on the app:

Two tables:
The Big Table holds pickup requests; the regular table holds locations. The tables are related via a text field “Location” in the Big Table that matches the Row ID in the regular table. The Big Table has lookup fields via this relation for name, address, city, state, zip, latitude, longitude, etc. The pickup requests Big Table will quickly exceed 25,000 rows once released to production. The locations regular table will likely never exceed 1,000 rows.

How the app works:

The submit action:

  • Set the column value for Location in the Big Table from the Get Part of URL value. That part executes just fine and always comes through to the webhook. This is also the key to the relation for the lookups.
  • The user is navigated to a confirmation tab.
  • A webhook is triggered to Make.com, relaying data to an API for another piece of software.
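Since the webhook sometimes arrives without the lookup fields, one defensive option is a guard on the receiving side (in the Make.com scenario or whatever sits behind it) that refuses to forward incomplete payloads. A minimal sketch, with assumed field names ("name", "address", etc.) that are stand-ins for the app’s actual columns:

```python
# Hypothetical payload check for a webhook receiver: reject submissions
# whose relation-derived lookup fields are missing or blank, instead of
# forwarding useless data downstream. Field names are assumptions, not
# the app's actual column names.

REQUIRED_LOOKUP_FIELDS = ["name", "address", "city", "state", "zip",
                          "latitude", "longitude"]

def validate_pickup_payload(payload: dict) -> list:
    """Return the required lookup fields that are missing or empty."""
    missing = []
    for field in REQUIRED_LOOKUP_FIELDS:
        value = payload.get(field)
        if value is None or str(value).strip() == "":
            missing.append(field)
    return missing

payload = {"location": "abc123RowID", "name": "Depot 7", "address": "",
           "city": "Austin", "state": "TX", "zip": "78701",
           "latitude": 30.27, "longitude": -97.74}
print(validate_pickup_payload(payload))  # → ['address']
```

If the list is non-empty, the scenario can stop and flag the record rather than pass a half-empty order to the downstream API.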

The challenge:
Sometimes the relation-derived lookup fields are sent to the webhook, and other times they are not. It appears to be a performance/timing issue with the lookups from the Big Table into the regular table. I have introduced as much as a 10-second (!!!) wait before the webhook step in the action and I am still seeing the issue intermittently. Without the lookup fields, the data sent to the webhook is useless.
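The general problem with a fixed wait is that it races the lookup either way: too short and the fields are still empty, too long and every user pays the delay. The usual pattern (and what Glide’s Wait for Condition step does conceptually) is to poll until the data is ready, with a timeout. A generic sketch of that pattern, not Glide code:

```python
import time

def wait_for(condition, timeout=10.0, interval=0.25):
    """Poll `condition` until it returns truthy or `timeout` seconds pass.

    Unlike a fixed 10-second wait, this returns as soon as the data is
    ready, and False only if the deadline expires.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Hypothetical usage: only fire the webhook once the lookup fields resolve.
record = {"name": "", "latitude": None}

def lookups_ready():
    return bool(record["name"]) and record["latitude"] is not None

record.update(name="Depot 7", latitude=30.27)  # simulate the lookup completing
print(wait_for(lookups_ready, timeout=2.0))  # → True
```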

It is critical that the user get an accurate response that the data was successfully submitted.

Thoughts:

  • I could expand the Set Column Values step of the action to write plain text fields in the Big Table from the lookups, instead of relying on lookup columns in the webhook step, but that seems totally redundant and backwards.
  • Make the regular table a Big Table as well? It is unclear whether that has any performance advantage.

The response from support was to add a “Reload Query” step to the action before the webhook when a Big Table is involved in this scenario. This does seem to resolve the issue, though it is unclear why it is necessary, and it is not documented.

Useful knowledge for those adopting big tables and using custom actions like webhooks.

If I see this happen again in testing I will bring it back up.


That’s good insight. I would have expected the lookups to work normally when you’re on the same device, but it seems introducing the Reload Query action forces them to be recalculated.

What I would have done, had they not suggested the Reload Query action, is either add the lookup fields somewhere on the same screen (to force them to be calculated), or add a Wait for Condition step to your combo action so the webhook step only runs once those fields are not empty.


@ThinhDinh similar thoughts as well.

How I figured this was working (not how it actually works):

  • The row is committed when the submit button is pressed, but it has no relation yet because the relation value is being set by the Set Column Value step.
  • The Set Column Value step sets the relation and is a record change, so I would expect the lookups to refresh because a relation now exists.
  • I also expected the webhook to perform a fresh query against the database rather than use the user’s cached version of the record.

A bit concerning to be honest. I would think a webhook should automatically generate a fresh query.

Turns out that (all?) actions live on the frontend and don’t look for backend changes with Big Tables. I assume this is a purposeful design choice for data efficiency. It is certainly more efficient the way they have done it, but more likely to cause user and developer headaches and confusion.

Anyway, happy to share the experience.


Ya, but now I think you know that’s not the case. I had issues with sending values through webhooks in the past, and I pretty much always have to introduce a Wait for Condition action if I need something to be recalculated right before triggering a webhook step.