Can you help me understand what the slowness of typing in the search bar can depend on?
It happens that, when I type a word in the search bar, only the first letter appears, and then, after about 4 seconds, the remaining letters appear all together.
I don’t think it’s a problem with the number of rows in the table… Could it depend on the number of relations set up in the table?
Exactly which contents of a row does the search function look through?
How many rows are you working with? I’ve seen search run slowly before on someone else’s app, but they were searching through thousands of rows. That list had to reload each time a letter was typed, and the more letters were typed, the faster the list loaded, because the result set kept getting smaller.
Having a lot of rows can mean a lot of data to cache on the device.
The search will typically search through any visible data in the list as well as any visible data in the details screen of the list items.
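The per-keystroke behavior described above can be sketched roughly like this (a hypothetical Python model, not Glide’s actual implementation, which is not public):

```python
# Rough model of per-keystroke search filtering (hypothetical).
rows = [
    {"name": "Blue Widget", "details": "barcode 12345"},
    {"name": "Black Widget", "details": "barcode 67890"},
    {"name": "Red Gadget", "details": "barcode 11111"},
]

def search(rows, query):
    """Match the query against all visible text in each row."""
    q = query.lower()
    return [r for r in rows
            if any(q in str(v).lower() for v in r.values())]

# Each keystroke re-runs the filter over the full table,
# but the displayed list shrinks as the query gets longer.
for query in ["b", "bl", "blu"]:
    print(query, len(search(rows, query)))
```

The point is that every visible field of every row is scanned on each keystroke, so the cost is highest at the first letter and falls as the query narrows the list.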
That’s about 8000 rows. But the problem doesn’t seem to be due to the number of rows, @Jeff_Hager, at least in my case.
I just tried with a clone of the slow table, and it is very fast since it has no relations defined.
Now I’m re-adding one relation at a time to see how it behaves.
I’ll let you know soon.
Thanks for your interest!
I found out why …
Using the relation, I built a data string for the caption of the list. These are additional product barcodes; the caption was useful precisely to be able to find an item through any of the multiple barcodes assigned to it. So at this point I think this is normal behavior, since I’m asking the search to go through a lot more data.
However, a search bar that freezes is not pleasant for the app user; it could be improved, at least with a wait icon in the middle of the screen that appears while the search is busy.
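For illustration, the multi-barcode caption described above can be mimicked in Python (the field names are hypothetical; in Glide this is a template/relation column, not code):

```python
# Hypothetical product rows, each with several alternate barcodes.
products = [
    {"name": "Espresso Cups", "barcodes": ["8001234567890", "8009876543210"]},
    {"name": "Tea Pot", "barcodes": ["8005555555555"]},
]

# Build the caption the way the relation column does: one string
# containing every barcode, so any of them becomes searchable.
for p in products:
    p["caption"] = " ".join(p["barcodes"])

def find_by_barcode(products, barcode):
    """Searching the caption finds the item by any of its barcodes."""
    return [p for p in products if barcode in p["caption"]]

print(find_by_barcode(products, "8009876543210")[0]["name"])  # Espresso Cups
```

This also shows why the search got slower: each row now carries the combined text of all its barcodes, multiplying the data the search must scan.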
Hi, unfortunately it depends a lot on the complexity of the table.
I have adopted a simple method for the lightning-fast search, but it is not applicable in all operational contexts; I’ll explain it to you, maybe it can be partially useful.
Let’s start from the limits it has:
- it needs a target column (but that target could also be a template column)
- the search input is case sensitive
- it produces a result only if the exact word is found
1. In your search target table, create a column that converts all the strings to lowercase.
2. Split that column on spaces; at this point you have an array of lowercase words.
3. Create a USC (user-specific column) to hold the search input.
4. Build a multiple relation between the USC column and the split column.
5. Produce the search result with an inline list that has the multiple relation as its source.
As I said, this method has limitations, but for me it was useful to get around the problem. I could apply it because the usage scenario is an administrative context; I would consider it not very applicable in a UI intended for public users.
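For illustration only, the lowercase/split/exact-match logic of this method can be mimicked in plain Python (in Glide these steps are configured as computed columns, not code; the names below are hypothetical):

```python
def build_index(rows, target_key):
    """Steps 1-2: lowercase the target column and split it on spaces,
    giving each row an array of lowercase words."""
    for row in rows:
        row["words"] = row[target_key].lower().split(" ")
    return rows

def exact_word_search(rows, query):
    """Steps 3-5: the search input is matched against the split column.
    The query is NOT normalized, which is why the method is case
    sensitive: 'Mug' will not match the lowercase word 'mug'."""
    return [r for r in rows if query in r["words"]]

rows = build_index(
    [{"title": "Blue Ceramic Mug"}, {"title": "Mug Holder"}],
    "title",
)
print([r["title"] for r in exact_word_search(rows, "mug")])
# → ['Blue Ceramic Mug', 'Mug Holder']
```

Note that `exact_word_search(rows, "mu")` returns nothing: as the limits above state, only an exact, lowercase whole word produces a result.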
I have run a lot of trials, and I have 20 to 30 apps in production. In my experience it depends on how you view the rows and on the particular screen you are searching in, not on the complexity or count of the columns. I dug in and tried to troubleshoot it, and the only fix I found is to slice these rows into different views.
For example, let’s say you have more than 8000 rows and a column containing an email, and you want each user to see only their assigned rows. The normal way Glide is designed, you would assign Row Owners so you don’t load all the rows at once on the client side; however, that takes a lot of time to compute on Glide’s servers before the data reaches the user’s side.
The workaround I used is to create a relation in the user’s table, and on the screen, instead of showing all 8000 rows to the users, I show them only the rows assigned to them.
In the example above, I sliced the rows into several views. The email works like a category: group the 8000 rows, create a filter sheet in Google Sheets that shows only the groups, and then relate the rows that belong to each particular group.
This gives a split view: the user picks the category they need to search before ever touching the whole 8000 rows.
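The slicing idea above, filtering down to a category or an assigned user before any search runs, can be sketched like this (hypothetical data; in Glide this is done with relations and filtered views rather than code):

```python
from collections import defaultdict

# 8000 rows, each tagged with a category (or an assigned user's email).
rows = [
    {"id": i, "category": f"group-{i % 4}", "text": f"item {i}"}
    for i in range(8000)
]

# Pre-group once, like the filter sheet / relation in the user's table.
by_category = defaultdict(list)
for r in rows:
    by_category[r["category"]].append(r)

def search_in_category(category, query):
    """Search only the chosen slice instead of the whole table."""
    return [r for r in by_category[category] if query in r["text"]]

# The search now scans 2000 rows instead of 8000.
print(len(by_category["group-0"]))  # 2000
print(search_in_category("group-0", "item 0")[0])
```

The search cost is now proportional to the slice the user picked, not to the whole table, which is why the split view feels faster.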
It’s good we’re looking into this!
And thank you very much for reporting your experience.
I would very much like others to weigh in too, especially someone from the Glide team, to get a more authoritative opinion than mine.
Watch the video; I ran a speed test:
SXAPP. The application on the left is completely new; it contains a Catalog table copied identically from the application on the right.
DXAPP. On the right is the application that is in production and suffers from slow search.
SXAPP was padded with mockup rows in other tables; it contains over 25,000 rows and few columns.
DXAPP is in its normal condition of use; it contains about 12,000 rows but a lot of columns.
Both applications have the same screen, with a single inline list component whose source is the Catalog sheet; obviously the sheets are in two different files.
The same sort and grouping settings have been applied to both.
Both applications have the same type of pro license.
From min 1:13 of the video, you can notice the super slow response time of DXAPP compared to SXAPP.
From min. 2:12 of the video, you can see that the response time is very long even to go back from the detail screen.
From min. 2:31 of the video, there is an interesting behavior: if you don’t clear the search with the X but simply type a new query over the previous one, the response time is fast even in DXAPP.
I wanted to run this demonstration to measure the influence of a large number of rows against that of a large number of columns (or rather: a simple table structure versus a complex one).
In my opinion, the number of rows has less impact than the data structure of the row.
I hope this experiment helps in better understanding the performance optimization strategy.
I’ve handled images slightly differently. I store them in an images table with a key column and an image column. Then in all the other tables, I create a template column for the key value, and a relation and lookup column to pull in the image. I can still control all images from one global table. I don’t know if it’s more efficient or not as far as speed, but I don’t have to keep adding columns if I need more images. I just add rows.
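Roughly, the single-images-table pattern works like a key/value lookup; here is a sketch with hypothetical names and URLs:

```python
# Global images table: one row per image, keyed by name.
images = {
    "logo-blue": "https://example.com/img/logo-blue.png",
    "logo-red": "https://example.com/img/logo-red.png",
}

# Every other table stores only the key (the template column);
# the relation + lookup resolves it to the image URL.
products = [
    {"name": "Widget", "image_key": "logo-blue"},
    {"name": "Gadget", "image_key": "logo-red"},
]

def lookup_image(product):
    """Relation on image_key, then lookup of the image column."""
    return images.get(product["image_key"])

print(lookup_image(products[0]))  # https://example.com/img/logo-blue.png
# Adding a new image is just a new row (dict entry), not a new column.
```

The design choice is rows over columns: the schema stays fixed while the image catalog grows, and every image remains controllable from one global table.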