🆕 Public Beta: Glide API v2.0 for Big Tables

Glide API v2.0 lets you work with Glide Big Tables programmatically.

The new and improved API consumes significantly fewer updates than v1.

We’ve also introduced stashes to let you easily import thousands of rows at a time.

Try it for yourself in our interactive API docs.
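To give a feel for it, here is a minimal sketch of adding rows in Python. The base URL, endpoint path, headers, and payload shape below are illustrative assumptions; the interactive docs have the exact request format.

```python
# Minimal sketch: add rows to a Big Table with Glide API v2.
# NOTE: the endpoint URL, auth header, and payload shape are assumptions
# for illustration only; check the interactive API docs for the real format.
import requests

API_BASE = "https://api.glideapps.com"  # assumed base URL
TABLE_ID = "YOUR_TABLE_ID"              # placeholder
TOKEN = "YOUR_API_TOKEN"                # placeholder

response = requests.post(
    f"{API_BASE}/tables/{TABLE_ID}/rows",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"rows": [{"Name": "Ada", "Score": 42}]},  # column names are examples
)
response.raise_for_status()
print(response.json())  # the response typically includes the new row IDs
```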

Note: API support for regular Glide Tables is coming soon.

Pricing

  • Add, Edit, or Delete rows = 0.01 updates per row changed (100x cheaper)
  • Get rows = 0.001 updates per row (no more rounding to 1 update per 1k rows)
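As a worked example at these rates: adding 1,000 rows costs 1,000 × 0.01 = 10 updates, and fetching 10,000 rows costs 10,000 × 0.001 = 10 updates.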

Walkthrough

8 Likes

This will be huge. I love this API version and have been using it since the beta. Thank you!

1 Like

Hi DJP,

This is a great feature to have, and I really appreciate the work put into it.

I was just wondering if we are going to get Math column or Rollup column values in future versions of this API? Is that something that will be considered?

@DJP I wanted to know as we are planning some features based on this.

Regards,
Dilip

I don’t think it’s likely, but is there a reason you don’t build your stuff around workflows that have access to computed columns?

Agree with Thinh here. If you are trying to use or perform a computation programmatically, workflows are your best option. Glide doc here for reference.

1 Like

Great! This looks awesome.

I was just wondering about query parameters. In v1 of the Big Tables API we could query with a limited subset of SQL. Can we still do this in v2 when getting rows?
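For reference, the v1 call I mean looks roughly like this (endpoint and payload recalled from the v1 docs, so treat the exact shape as an assumption):

```python
# Rough sketch of the v1 SQL-style query; exact endpoint/payload may differ.
import requests

response = requests.post(
    "https://api.glideapp.io/api/function/queryTables",  # v1 endpoint (assumed)
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},
    json={
        "appID": "YOUR_APP_ID",
        "queries": [
            {"sql": 'SELECT * FROM "Orders" WHERE "Status" = ?', "params": ["Open"]}
        ],
    },
)
print(response.json())
```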

That’s a good question. I used that a lot, and I wonder if it’s supported in v2 as well.

If not, I would just use the v1 method for it :sweat_smile:

2 Likes

Does that mean the Make and Zapier integrations consume only 0.01 updates per row changed when used on Big Tables?

No, this pricing only applies to the Glide API v2.0.

1 Like

Hi @ThinhDinh @DJP ,

Thank you for replying. My computed columns basically add values from multiple columns, or multiply them, or are Rollup columns, etc.

These don’t come in the API response, right?

Regards,
Dilip

No, they don’t.
As a general rule, computed columns are evaluated on the user’s device, so they are not available in the backend, which means the API does not have access to them. A further complicating factor is that computed columns often give user-specific results, and API queries cannot emulate any specific user.

3 Likes

Hi @Darren_Murphy ,

Thank you for replying.

If I have to get those values, how can I work around that?

Regards,
Dilip

You should use the new workflows if you intend to do something related to computed columns. But granted, not all computed columns are supported; e.g. the JavaScript column is not.

4 Likes

Hi @ThinhDinh ,

Thank you for replying. Let me try this and get back.
Regards,
Dilip

1 Like

Hey Darren, all the experts, and @NoCodeAndy:

It certainly seems GBT is the future, but still with caveats and workarounds. I am finally about to move 4 Google tables to Glide Big Tables and want to make sure my app doesn’t break completely. The ‘good’ news is I don’t do too much ‘fancy’ in these Google tables, although I use the hashvalue computed column to create an index that I write into other tables and then use as a Query/Lookup (I think that will break), as well as querying computed columns between the two soon-to-be GBTs. The return sets of these are <100 rows, but I’m not sure if this query is allowed. (FYI, these computed columns strip characters to find ‘fuzzy’ matches between the two tables.)
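To illustrate the character-stripping, the ‘fuzzy’ key is roughly this kind of normalization (a made-up example, not my actual column logic):

```python
# Roughly the kind of normalization used to build a "fuzzy" match key
# between the two tables (illustrative only).
import re

def fuzzy_key(value: str) -> str:
    """Lowercase and strip everything except letters and digits."""
    return re.sub(r"[^a-z0-9]", "", value.lower())

# "ACME Corp., Ltd." and "Acme Corp Ltd" both normalize to "acmecorpltd"
print(fuzzy_key("ACME Corp., Ltd.") == fuzzy_key("Acme Corp Ltd"))  # True
```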

Some general questions

If computed columns are usually evaluated on the user’s device:

  • What are the current design implications of using GBT with the computed column restrictions?
    – Dos and don’ts? Specific workarounds?
  • Is there a list of computed columns that are processed on the server versus the client?
    – Lookups? ITE? There have been (and continue to be) upgrades/changes to GBT, and I am guessing this has to do with where processing occurs.
  • Is there a recent video or two or three highlighting the upgrades to GBT and design considerations? (@Robert_Petitto @NoCodeAndy ??)

I have looked through pretty much everything in the community, but much of it doesn’t take into account the latest updates to GBT, which seem to have major implications.

Thanks as always

Is it possible to delete rows in bulk with a single API call, like it used to work with mutations?

Good question. I believe it’s a no considering the “delete” docs.

However, I have never used this. If I need to delete rows, it has almost always been a case of replacing the whole table with the latest data, so I use “overwrite table” anyway.
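If you really do need to remove specific rows, the obvious fallback is a loop of single-row deletes; a rough sketch, assuming the per-row delete endpoint follows the same pattern as the other v2 row endpoints:

```python
# Sketch of a fallback: delete rows one at a time via the v2 API.
# NOTE: the per-row delete path is an assumption; confirm it in the API docs.
import requests

API_BASE = "https://api.glideapps.com"
TABLE_ID = "YOUR_TABLE_ID"
TOKEN = "YOUR_API_TOKEN"

row_ids_to_delete = ["rowID-1", "rowID-2"]  # placeholders

for row_id in row_ids_to_delete:
    r = requests.delete(
        f"{API_BASE}/tables/{TABLE_ID}/rows/{row_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    r.raise_for_status()
```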

1 Like

Just want to drop a big thanks for this feature. I’m going to implement it right away in a few places, and I’m thrilled about how many headaches and compromises this resolves :sweat_smile:

Fun fact: for one project with invoice line-item reading, we ran into an API call cost issue because of the volume and the insane number of items on some documents.

So I came up with an idea and developed a fully functional workaround that allows editing, adding, and deleting line items, based on a single JSON array that is dynamically parsed out for display in the UI.
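Roughly the idea, with made-up field and column names:

```python
# Sketch of the single-JSON-column approach: keep all line items in one
# JSON array, and add/edit/delete by rewriting that array instead of rows.
import json

line_items_cell = '[{"sku": "A-1", "qty": 2}, {"sku": "B-7", "qty": 1}]'

items = json.loads(line_items_cell)       # parse for display in the UI
items.append({"sku": "C-3", "qty": 5})    # "add" a line item
items = [i for i in items if i["sku"] != "B-7"]  # "delete" a line item

line_items_cell = json.dumps(items)       # write the array back to the one cell
```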

And I did this… just one day before this post dropped :rofl:

Glad I built that because it was fun but even more glad I’ll never need to do it again :grinning_face_with_smiling_eyes:

3 Likes

How does pricing work for “server-side” updates?

Consider a workflow that loops over a Big Table containing 30,000 rows, updating a date column.

Is that 30,000 gets plus 30,000 edits, i.e. 30 + 300 = 330 updates?

Thanks

1 Like