Has anyone found a workaround when pulling data from the API to generate a CSV where the rows are over 1000?
Currently needing over 2000 rows pulled each month.
Big Tables or Regular tables?
If regular tables, you can use the continuation option and make multiple calls.
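The continuation approach can be sketched like this. `query_page` is a hypothetical stand-in for however you call the Glide API (e.g. via `requests`); the key idea is just to keep passing the continuation token back until the API stops returning one, then write everything to CSV at once:

```python
import csv
import io

def fetch_all_rows(query_page):
    """Collect every row by calling the API repeatedly until no
    continuation token comes back.

    `query_page(continuation)` is a placeholder for your actual API
    call; it should return (rows, next_continuation), where
    next_continuation is None on the last page.
    """
    rows = []
    continuation = None
    while True:
        page, continuation = query_page(continuation)
        rows.extend(page)
        if not continuation:
            return rows

def rows_to_csv(rows):
    """Serialize a list of dicts into CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

The 1000-row cap applies per call, not per export, so 2000+ rows just means two or three pages stitched together before the CSV is written.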
If Big Tables, the only option I know of is to split your query up into multiple calls. For example, if you are filtering by a date range, break that up into two smaller ranges, make two calls, and concatenate the results.
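A minimal sketch of the date-range splitting idea, assuming your query filters on a date column. `query_range` is a hypothetical placeholder for the Big Tables call; the helper just guarantees the sub-ranges are contiguous and non-overlapping before concatenating:

```python
from datetime import date, timedelta

def split_range(start, end, parts=2):
    """Split [start, end] into `parts` contiguous, non-overlapping
    date sub-ranges so each API call returns fewer rows."""
    total_days = (end - start).days + 1
    step = -(-total_days // parts)  # ceiling division
    ranges = []
    cursor = start
    while cursor <= end:
        sub_end = min(cursor + timedelta(days=step - 1), end)
        ranges.append((cursor, sub_end))
        cursor = sub_end + timedelta(days=1)
    return ranges

def fetch_by_date(query_range, start, end, parts=2):
    """`query_range(start, end)` stands in for your filtered Big
    Tables query; results come back concatenated in date order."""
    rows = []
    for s, e in split_range(start, end, parts):
        rows.extend(query_range(s, e))
    return rows
```

If one month still exceeds the limit, bump `parts` up until each slice stays safely under it.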
Big Tables. I'm attempting to do it in Make (looping back each time at a 990-row limit just to be safe). Would you say it's easier to just do separate calls in Glide?
Your approach sounds okay.