Experimental code columns to compress large data

I needed to export a computed column value that was too large to send externally via webhook (a JSON string of about 10,000 rows, about 1.5 MB), so I created a string compression column.

Using the pako library created by nodeca, I was able to compress it down to 44 KB, which made exporting the data easier.
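For reference, pako's default deflate output is zlib-format data, so the same compress-then-base64 pipeline can be sketched in Python with the standard zlib module. This is only a minimal illustration with made-up sample rows standing in for the real column value; the actual code column runs pako in JavaScript:

```python
import base64
import json
import zlib

# Hypothetical stand-in for the real computed column value (~10,000 rows).
rows = [{"id": i, "name": f"row {i}", "note": "example " * 5} for i in range(10_000)]
payload = json.dumps(rows)

# pako's default deflate emits zlib-format data, so zlib.compress
# produces output the same decompressor can read.
compressed = zlib.compress(payload.encode("utf-8"), level=9)

# base64 makes the binary safe to carry in a string column or webhook body.
encoded = base64.b64encode(compressed).decode("ascii")

print(f"JSON: {len(payload):,} bytes -> compressed+base64: {len(encoded):,} bytes")
```

Base64 adds roughly a third on top of the compressed binary, but the result is still far smaller than the original JSON.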

My plan does not include API access for downloading data, so I use this approach instead. It may be useful for people in a similar situation.

Please share your experimental code columns with us!
Thank you.

This is interesting. What does your full flow look like? I assume you deflate it and send the result through the webhook, but where does the “inflate” part come into play?

You can join columns and send them out with Fetch JSON… much easier

The webhook triggers an AWS Lambda function, which converts the base64 string back to binary and inflates it inside the function, then aggregates and edits the data in pandas.
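A minimal sketch of what that Lambda handler might look like, assuming the webhook delivers the base64 string in a JSON body under a `payload` key (the field name, event shape, and aggregation are assumptions, not the author's actual code):

```python
import base64
import json
import zlib

import pandas as pd

def handler(event, context):
    # Assumed webhook shape: JSON body with the base64 string under
    # a "payload" key; adjust to the actual request format.
    encoded = json.loads(event["body"])["payload"]

    # base64 -> binary, then inflate; Python's zlib understands
    # pako's default (zlib-format) deflate output.
    raw = zlib.decompress(base64.b64decode(encoded))
    rows = json.loads(raw)

    # Aggregate and edit the data in pandas, as described above.
    df = pd.DataFrame(rows)
    summary = df.describe()  # placeholder for the actual aggregation

    return {"statusCode": 200, "body": summary.to_json()}
```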

My goal is to be able to send large data externally at any time; the app itself only compresses the data.

I had not tried Fetch JSON. Thanks, I’ll try it next time.

At first I sent the 1.5 MB of data via webhook, but the app and browser froze. When I looked at the developer tools, it seemed that the data was failing to send, so I wanted to compress it somehow.

The Experimental Code Column refers to externally hosted code, but that code is executed locally in the browser. I chose this column type because there is probably no data transmission overhead. I am a Glide beginner, so my understanding may be wrong.
