Larger database

If I use a Google Sheets table with more than 10,000 records, the response is very slow. With 100,000 it is not usable. I am not speaking only about text search; it affects every user interaction. Is there any way to avoid this shortcoming?


The number of rows isn't the main problem in my opinion; the number of columns also hurts performance if a good schema or data design wasn't established at the beginning.

Also, many relations or computed columns used incorrectly can affect your app.
Could you tell us more details about your issue and when it happens?


Hello, friend!

No, this is a plain Google Sheets table with 100,000 records. At first, Glide didn't even want to create the app for this sheet. After a small workaround I got it working, but with 100,000 records and just an inline list, the delay was several minutes. After reducing to 15,000 records, the response time for displaying the list was still 1-2 minutes.

No computed columns, no relations!

I did a similar test a couple of years ago with maybe 2 or 3 basic columns and 120,000 rows. I tried to display all of them in one list as well, and I agree that it was very slow to load. But keep in mind that if you are trying to display all 100,000 or even 10,000 rows in a list at the same time, it's going to take a long time to render all of the HTML and underlying JavaScript needed to display the list along with the code for its list actions. That HTML and JavaScript ends up being quite a large amount of data to cache and hold in memory, which will fill up your device's RAM very quickly and max out your CPU.

Plus, nobody is going to scroll through a list that big, so using proper filters and/or relations to reduce what’s shown on the screen is advised.

Glide guarantees that it can sync 25k rows from a Google Sheet without issues. Anything above that starts to become unreliable due to limitations in Google's API. It's possible, but unreliable. Most importantly, trying to render every single row on the screen is a bit extreme; you should consider some categorization or filtering to reduce the number of rows that are visible. Even setting the list to show only a handful of rows would be advised, and the built-in search would still work across all rows (including those not visible).
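The idea above - keep the full dataset available to search, but render only a small page of it - can be sketched outside of Glide. This is only an illustration of the pattern, not Glide's internals; the names `search_rows` and `paginate` are made up for the example:

```python
# Sketch: search the whole dataset, but render only one small page.
# `rows`, `search_rows`, and `paginate` are illustrative names, not Glide APIs.

def search_rows(rows, query, fields):
    """Match the query against every row, not just the visible ones."""
    q = query.lower()
    return [r for r in rows if any(q in str(r[f]).lower() for f in fields)]

def paginate(rows, page, page_size=25):
    """Return only the slice of rows that actually needs to be rendered."""
    start = page * page_size
    return rows[start:start + page_size]

# 100,000 rows in memory is cheap; rendering 100,000 list items is not.
rows = [{"id": i, "name": f"Item {i}"} for i in range(100_000)]
matches = search_rows(rows, "item 9999", ["name"])  # searches all 100k rows
visible = paginate(matches, page=0)                 # renders at most 25
```

The expensive part in the reports above is the rendering, so capping what reaches the screen (here, 25 items per page) is what keeps the UI responsive, regardless of table size.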

I have a table in my app with 10k rows. It takes a few seconds to load initially, but by no means several minutes, and I am definitely not displaying everything on the screen. Most of my slowness is due to the large number of computed columns I'm dealing with, not the number of rows in the table. I only display maybe 200 rows at most, so it's easily manageable and loads in a reasonable amount of time.


Ok, thanks for giving us more details.

Jeff has described a good case with this amount of data and its considerations. In your test, how many columns does your table have?

I have written about my test before: I have a Google Sheet working as an API, and with 45k rows and 6-8 columns any reply takes 4-6 seconds; but working with 34 columns (the same 45k rows), the performance is poor and replies arrive in 9-12 seconds (sometimes more).
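That column effect shows up in payload size alone. A rough sketch with synthetic data (not the actual sheet, and a smaller row count to keep it fast) compares the JSON payload for 8 columns versus 34 columns of the same rows; with the real Sheets API, the equivalent trick is restricting the range (e.g. `Sheet1!A:H`) so only the needed columns are returned:

```python
import json

# Synthetic stand-in for a wide sheet; 45k rows behave the same way,
# so 1,000 rows is enough to show the ratio.
N_ROWS = 1_000
wide = [[f"r{r}c{c}" for c in range(34)] for r in range(N_ROWS)]

def project(rows, n_cols):
    """Keep only the first n_cols columns, as a ranged read (e.g. A:H) would."""
    return [row[:n_cols] for row in rows]

narrow = project(wide, 8)

wide_bytes = len(json.dumps(wide).encode())
narrow_bytes = len(json.dumps(narrow).encode())
# The 34-column payload is several times larger than the 8-column one,
# which tracks with the slower replies reported above.
print(wide_bytes, narrow_bytes)
```

The payload roughly scales with cell count, so cutting 34 columns down to the 8 you actually need shrinks every response by about 4x before any other optimization.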
