API Calls as a solution to the Filter/Search problem of Big Tables? - Price Issue

Hi there!

Currently Glide charges around $0.01 per update/edit of a row (you can buy 1,000 updates for $10). However, when using the new API call action to update rows in Glide, each of those updates also costs $0.01.

I saw API calls as a solution to some missing functionality in Glide Big Tables, in particular not being able to filter/search on calculated fields. For example, if you have a Team Big Table and a Goal Big Table, and you connect Goals to Teams through a relation on the Team’s Row ID, you can still look up the Name of the Team with a Lookup Column, but you can’t filter on that name because it’s calculated.

Instead of creating just a Lookup Column, you can create an extra Text Column and dump the Name of the Team into that Text Column every time you create or edit a Goal. This way you can use Filter/Search on the Team’s name, just like with a normal Glide Table.
However, there’s one issue: what if you edit the name of the Team? The value dumped into the Text Column will not be updated automatically.
In this case you can add an action so that every time you change the name of a Team, you make an API call that updates the name in the associated Goals’ Text Column.

This would work perfectly from a technical point of view. However, at $0.01 per update you would soon go bankrupt, as one Team edit can lead to hundreds of edits on Goals.

The result is that it becomes best practice to build relations on ‘Names’ rather than on ‘Row IDs’ with Big Tables if you still want to be able to filter or search on the name. Changing names in that case breaks relations, which is of course not very user friendly.

Any thoughts on this?

Nathan

You don’t need to use Call API for this – you could just set the column value in the action (which may also use an update anyway).

How many times do you think this action would run each day in your app? Our plans include thousands of updates. Unless this is happening hundreds of times per day, I doubt you would even notice, let alone go bankrupt.

I don’t think that would work in this case, because presumably there would be several rows that need updating. And it’s not possible to do a Set Column Values through a multi-relation/query… :man_shrugging:

3 Likes

@david @Darren_Murphy

How awesome of you two to join the chat! (It’s an honor to have you David)
I made a wrong assumption. Let me give an example. (If you don’t have time, the conclusion is in bold at the bottom of this post, but I think it’s harder to understand without the example.)

Let’s say you have a project with two Big Tables:

1. A Supplier Big Table → A Supplier can have multiple Products

2. A Product Big Table → A Product can be associated with one Supplier

I know that you can log calculated fields of Big Tables in simple Text fields with the ‘Set Column Values’ action every time you create or update an item in a table. This is very useful, as it still allows you to search, group and filter by a calculated field through that log field! I did this by logging the calculated ‘Supplier Name’ Lookup Column into the ‘Name Text Log’ field in the Product table, as you can see below.

The problem with the above, however, is that when the name of a Supplier changes, the ‘Name Text Log’ field in the Product table will still show the old name (while the Lookup field will change, that doesn’t help because we unfortunately cannot use it to filter/sort/group), so filtering, sorting and grouping will happen on the old name.

My goal was therefore to also change the ‘Name Text Log’ Column in the Products table whenever the associated Supplier’s name was changed. I thought I could do this with an API call to the Products table (with one mutation for every associated Product) whenever the Supplier’s name was edited. This would become expensive: if a Supplier has 300 associated Products, for example, you would pay $3 for one edit (edits cost $10 per 1,000, or $0.01 per edit).

To be able to create such an API call and add it to an action, you would have to create a Template Column with the body of the API call, containing one mutation for every associated Product. You can construct that mutation with a Template Column in the Product table. I did this in the example with the ‘JSON Update Element’ column, see below:

However, to join the mutations into one JSON array, you would have to create a Query Column in the Supplier table which finds all associated Products, and then create a Joined List of the ‘JSON Update Element’ column of those associated Products.

However, creating Joined Lists on calculated columns in Big Tables is not possible, so you can’t create an API call that changes a column value in all associated rows.

Therefore my remark becomes rather hypothetical, and consists of two remarks rather than one:

1 - Bulk-updating simple fields in associated rows in another Big Table (in order to use them in filters/groupings/searches) when an entry in the current table changes is not possible. That’s a pity, and changing that would be awesome.
2 - Secondly, if 1 became possible, the current pricing of Glide wouldn’t work, as 1 requires a massive amount of updates.

Finally, if 1 and 2 were changed, Big Tables would have almost the same functionality as normal Tables (apart from the loading times, of course), which would be massive in terms of app reusability. Instead of constantly copying my apps (with Templates), I would be able to build every app with Big Tables and onboard new customers on one and the same app, meaning that any update I make to the app, I only have to make once.

Finally 2: another, maybe even better, solution to the above (maybe changing Big Tables is not possible) would be the ability to create a ‘Master’ app, such that copies of that Master app always follow the changes you make to it. This way you could create SaaS-like apps with Glide very easily.
This would be an enormous increase in Glide’s value proposition (more people would be able to onboard Glide apps → agencies would earn more money → Glide would earn way more money). I will add this as a new feature request!

Yes, that’s what I understood.

That won’t work, as I guess you’ve discovered, because you can’t take a Joined List of a computed column from a Big Table.

However, there is a workaround - and this is something that I have used, so I know it will work.

  • Start by creating a Query, matching the SupplierID of the one you want to update.
  • Take a Joined List of RowIDs through the Query (this will work, because it’s a non-computed column)
  • Feed the Joined List into a JavaScript column that processes the list of RowIDs, and outputs a JSON collection of API mutations, each of which updates your Text column with the new name
  • Use the JavaScript column in a Call API action

As I said, this works perfectly well. I’ve actually been using the technique to good effect just in the past few days. The only thing you need to watch out for is the upper limit (500) on mutations in a single API call. If you need to update more than 500 rows in one go, then you’ll need to batch them up.
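The batching mentioned above can be sketched in plain JavaScript (the 500 cap is Glide’s per-call mutation limit; each resulting batch would then go into its own Call API action):

```javascript
// Splits a flat list of mutations into batches of at most 500,
// since a single Glide API call accepts at most 500 mutations.
function batchMutations(mutations, batchSize = 500) {
  const batches = [];
  for (let i = 0; i < mutations.length; i += batchSize) {
    batches.push(mutations.slice(i, i + batchSize));
  }
  return batches;
}

// e.g. 1200 mutations -> 3 batches of 500, 500 and 200
```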

oh, I should add - even using this technique, you’ll still be using an update for every mutation - so that part of it won’t change.

1 Like

yes, you can - see my previous reply :point_up: :wink:

1 Like

What an awesome idea! I was tinkering with the Row IDs too, but as my knowledge of JavaScript is nearly non-existent, I didn’t think of this.

Note to self: learn JavaScript!

However, I found a workaround with Glide AI which is totally No Code:

And the result!

Gotta love the power of Glide <3

2 Likes

hahaha, 10/10 for thinking outside the box :rofl:

It took me at least 20 minutes or more to write and test my JavaScript, and you got the same thing in 5 :facepunch:

1 Like

Alright @david. Now that we know that you can actually bulk-update simple Text fields of associated rows (up to 500 row edits at a time, which will cost $5), and that this solution puts Big Tables more on par with normal Tables, isn’t the current pricing model for row edits through API calls a bit too limiting? A friendlier pricing model for API calls would make really big applications (lots of users/data) more doable in Glide versus going to another no-code tool (which would be my standard thinking now).

Of course, implementing this change (which is probably huge in development work for the Glide team, but also fundamentally changes the amount of value Glide can deliver to the world) would also be a solution, as it allows using normal Glide Tables in SaaS-like solutions: For Agencies: "Slave Apps" that follow changes in a "Master Apps" (Semi-SAAS/Template Alternative)

Beware with the output though: make sure it always returns only the JSON and nothing else. If it’s too nice and says something like “Here’s your JSON”, then you’re screwed.
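One defensive pattern, sketched here as plain JavaScript rather than anything Glide-specific, is to keep only the text between the first [ and the last ] before parsing, so a chatty preamble or sign-off doesn’t break things:

```javascript
// Extracts a JSON array from an AI response that may be wrapped in
// extra prose like "Here's your JSON: [...] Hope that helps!".
// Returns null if no parseable array is found.
function extractJsonArray(text) {
  const start = text.indexOf("[");
  const end = text.lastIndexOf("]");
  if (start === -1 || end === -1 || end < start) return null;
  try {
    return JSON.parse(text.slice(start, end + 1));
  } catch (e) {
    return null;
  }
}
```

This is only a heuristic: it assumes the response contains exactly one JSON array and no stray brackets in the surrounding prose.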

1 Like

Hi @Darren_Murphy

I have a use case where I might have more than 100 rows in my Big table.

So I was thinking of using this solution.

I first created a Query Column which checks all rows where CreatorName is not empty.

I pass this through a Joined List, which gives me the IDs of all rows separated by commas.

I am stuck here as I am not well versed with JavaScript.

ChatGPT gave me this script:

// Function to simulate fetching or initializing table data dynamically
function getCurrentTableData() {
  // This data would normally be dynamically fetched or updated
  return [
    { rowID: "row1", p1: "dynamicValue1" },
    { rowID: "row2", p1: "dynamicValue2" },
    { rowID: "row3", p1: "dynamicValue3" }
    // Additional rows can be dynamically added here
  ];
}

// Function to generate JSON array based on the latest data
function createJsonArrayFromTable() {
  const tableData = getCurrentTableData(); // Fetch the current dynamic data

  const jsonArray = tableData.map(row => {
    return {
      "kind": "add-row-to-table",
      "tableName": "native-table-jWEMMuChAFBW3berD6UM",
      "columnValues": {
        "Name": row.p1, // Dynamically set from each row
        "hUS5E": row.p1, // Same as above
        "3Fbda": "false",
        "WBVZ6": "false",
        "TeIW1": "false",
        "VA1A4": row.p1 // Dynamically set from each row
      }
    };
  });

  return jsonArray;
}

// Call the function to generate the JSON array
const result = createJsonArrayFromTable();

// Output the result
console.log(JSON.stringify(result, null, 2));

But I am assuming this script expects the rows to be added manually?

Can you help me with the sample you have? I could use it as a reference.

Also, I have a creator API mutation which takes the required values from the row. Is it possible for me to pass that same thing to JavaScript?

Thank you in advance.

Regards,
Dilip

Hi Guys,

Tried this JavaScript:

let creatorAPIMutation = "p1";

// JSON array of rows
let rows = [
  {
    "kind": "add-row-to-table",
    "tableName": "native-table-jWEMMuChAFBW3berD6UM",
    "columnValues": {
      "Name": "WeeklyID",
      "hUS5E": "CreatorName",
      "3Fbda": "100%Compliant",
      "WBVZ6": "StillWithUs",
      "TeIW1": "InboundSafe",
      "VA1A4": "EntryDate"
    }
  },
  {
    "kind": "add-row-to-table",
    "tableName": "native-table-jWEMMuChAFBW3berD6UM",
    "columnValues": {
      "Name": "ughRV70bRrmMxq59sNLtCg",
      "hUS5E": "Sibi",
      "3Fbda": "false",
      "WBVZ6": "true",
      "TeIW1": "false",
      "VA1A4": "ughRV70bRrmMxq59sNLtCg"
    }
  }
];

// Using the creatorAPIMutation value
// Replace the YOUR_CREATOR_API_MUTATION_VALUE with the actual value from your CreatorAPIMutation
let inputRow = {
  "kind": "add-row-to-table",
  "tableName": "native-table-jWEMMuChAFBW3berD6UM",
  "columnValues": creatorAPIMutation
};

// Add the input row to the rows array
rows.push(inputRow);

// Convert the rows array to JSON
let jsonArray = JSON.stringify(rows, null, 2);
console.log(jsonArray);

Where p1 is going to be the input, which takes a Template Column that has the API mutations separated by commas.

I’m still not getting any response from the JavaScript.

Thank you in advance

Regards,
Dilip

Your JavaScript doesn’t return anything.

Here is an example of some JavaScript that I use to prepare a batch of mutations:

const rowIDs = p2.split(',');
const mutations = [];
for (let rowid in rowIDs) {
  const child = `{
    "kind": "set-columns-in-row",
    "tableName": "native-table-XXXXX",
    "columnValues": ${p1},
    "rowID": "${rowIDs[rowid]}"
  }`;
  mutations.push(child);
}
return mutations.join(',');

In the above case, p2 is a Joined List of RowIDs (all the rows I want to change), and p1 is a JSON Object column representing a single mutation. In my case I’m setting the same values in every row. I believe that your case is slightly different, so that code won’t work as it is for you. But it should at least get you started.

2 Likes

Hi @Darren_Murphy Thank you for replying.

I used ChatGPT to get various solutions based on your JavaScript.

The latest one I got is this:

const rowIDs = p2.split(','); // Get row IDs from p2
const rowData = JSON.parse(p1); // Parse p1 as JSON, assuming p1 is a JSON string with row data

const mutations = [];

for (let rowid of rowIDs) {
  // Safely check if data exists for the current rowID
  if (rowData[rowid]) {
    const currentRowData = rowData[rowid]; // Get data for the current rowID

    // Add dynamic row data
    const child = `{
      "kind": "add-row-to-table",
      "tableName": "native-table-jWEMMuChAFBW3berD6UM",
      "columnValues": ${JSON.stringify(currentRowData)},
      "rowID": "${rowid}"
    }`;

    mutations.push(child);
  } else {
    // Optionally handle the case where no data is found for a rowID
    const child = { "kind": "add-row-to-table", "tableName": "native-table-jWEMMuChAFBW3berD6UM", "columnValues": {"error": "No data found for rowID: ${rowid}"}, "rowID": "${rowid}" };
    mutations.push(child);
  }
}

return mutations.join(',');

Where I am trying to iterate through every column value.

I have put an if/else condition to catch any errors.

It always goes to the error part, where it says “No data found”.

Does Glide not allow us to execute everything in one shot, so that I can convert it to a string?

Will I need a temporary-variable approach to solve this?

Thank you in advance for help

Regards,
Dilip

Please show me what you are passing as p1.

Note: when pasting formatted text such as code, enclose it in triple backticks. That will ensure it is formatted correctly, eg:
```
this is a code block
```

will render as follows:

this is a code block

Hi @Darren_Murphy ,

I am passing an individual JSON object as p1, e.g.:

[{"Name":"ughRV70bRrmMxq59sNLtCg","hUS5E":"Venkada Subramaniyan ","3Fbda":"true","WBVZ6":"true","TeIW1":"true","VA1A4":"4 September 2024"}]

The JS code to get that is as follows

let jsonArray = [];

function processCreatorAPIMutation(p1) {
  try {
    // Check if p1 is a string (i.e., needs to be parsed) or an object
    let rowObject = typeof p1 === "string" ? JSON.parse(p1) : p1;

    // Check if the rowObject has 'columnValues', and push its value to the global jsonArray
    if (rowObject && rowObject.columnValues) {
      jsonArray.push(rowObject.columnValues);
    }
  } catch (error) {
    console.error("Error parsing JSON:", error, "Input data:", p1);
  }

  // Convert jsonArray to a JSON string before returning it
  return JSON.stringify(jsonArray); // Return the combined array as a string after each invocation
}

// The variable 'p1' will be automatically passed from Glide.
// The global jsonArray will accumulate the results from each row.
return processCreatorAPIMutation(p1);

The p1 input to the above script is my creatorAPIMutation, which is a Template Column.

Regards,
Dilip

So you’re just passing data for a single row?

Hi @Darren_Murphy

Yes, currently it’s passing data for a single row.

I am assuming that my JS, which has “add dynamic row data”, will fetch all rows’ data and print it?

Is my assumption wrong?

This is the code where I am taking each row’s APIMutation value and trying to convert it into a string. It always gives me only the corresponding row’s value.

let jsonArray = [];

function processCreatorAPIMutation(p1) {
  try {
    // Check if p1 is a string (i.e., needs to be parsed) or an object
    let rowObject = typeof p1 === "string" ? JSON.parse(p1) : p1;

    // Check if the rowObject has 'columnValues', and push its value to the global jsonArray
    if (rowObject && rowObject.columnValues) {
      jsonArray.push(rowObject.columnValues);
    }
  } catch (error) {
    console.error("Error parsing JSON:", error, "Input data:", p1);
  }

  // Convert jsonArray to a JSON string before returning it
  return JSON.stringify(jsonArray); // Return the combined array as a string after each invocation
}

// The variable 'p1' will be automatically passed from Glide.
// The global jsonArray will accumulate the results from each row.
return processCreatorAPIMutation(p1);

This is the input I am giving to the JS I sent previously.

Somehow even this script is giving me only the corresponding row’s values, and not the combined ones.

Please pardon me if there is some stupid mistake; I have no knowledge of JS and am relying on ChatGPT.

Regards,
Dilip

ChatGPT said this to me when I asked about it only sending in one value:

Glide invokes the script for each row separately, and the function doesn’t maintain the accumulated values across these invocations. JavaScript functions in Glide are stateless between executions, so the jsonArray you are using is reset every time the function runs.

So was wondering is there a workaround?

Regards,
Dilip

Yes, your assumption is wrong. The JavaScript column will only work with the data that you give it.
Yes, it will process the data for each row, but each row will only contain the data for that specific row.

The idea with this solution is that you pass data for all rows as a single parameter, process that and output the result, then take the result from the first row in the table. It sounds to me like you should be passing multiple joined lists. A joined list of RowIDs, and then a separate joined list for each parameter that you want to add/edit. The JavaScript column only accepts 3 parameters, so you’ll probably end up having to create a joined list of joined lists, and then do a double split in the code.
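A sketch of that double split (the delimiters, table name and column IDs here are assumptions, not from any real app): p2 is the Joined List of RowIDs, and p1 is a Joined List of per-row value lists, where rows are separated with | and the values within a row with commas, in the same order as the RowIDs.

```javascript
// p2: "rowA,rowB"            -- Joined List of RowIDs
// p1: "Alice,true|Bob,false" -- Joined List (|) of per-row value lists (,)
// Builds one set-columns-in-row mutation per row, pairing each RowID
// with its own values. Table name and column IDs are placeholders.
function buildMutations(p1, p2) {
  const rowIDs = p2.split(",");
  const valueLists = p1.split("|"); // first split: one entry per row
  return rowIDs.map((rowID, i) => {
    const [name, flag] = valueLists[i].split(","); // second split: per-row values
    return {
      kind: "set-columns-in-row",
      tableName: "native-table-XXXXX",
      columnValues: { Name: name, hUS5E: flag },
      rowID,
    };
  });
}
```

The key point is that all the data arrives in one invocation of the JavaScript column, so no state needs to survive between rows.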