I haven’t tested it yet: is the result the same as “select all rows in a table, delete them, then import the new data”?
The client has realised they can do this manually, as described above, but I’d prefer to automate it so the process stays controlled.
Please let me know if you have experience using this endpoint so I can understand the behaviour before I spend time testing. Thanks in advance.
PS: If I clear the table data so only the header row remains, then upload/replace the data with a .csv file containing, say, 5,000 records, what does the API usage look like?
And if I replace a table (remove it using the API and then recreate it, which I haven’t looked at how to do yet), is that one API credit per row replaced/imported? A rough sketch of the flow I want to automate is below.
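For context, this is roughly what I have in mind. It’s an untested sketch assuming Glide’s v1 `mutateTables` endpoint; the app ID, table name, API key, and column values are all placeholders, so please correct me if the request shape is off:

```python
# Untested sketch of the "clear, then re-import" flow I want to automate.
# Assumes Glide's v1 mutateTables endpoint; app ID, table name, API key,
# and column values are placeholders, not real values.
# NB: Glide caps the number of mutations per request, so a large import
# would need batching; check the docs for the current limit.
import requests

API_URL = "https://api.glideapp.io/api/function/mutateTables"
HEADERS = {
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder
    "Content-Type": "application/json",
}

def clear_rows(app_id: str, table_name: str, row_ids: list[str]) -> None:
    """Delete every existing row: one delete-row mutation per row."""
    mutations = [
        {"kind": "delete-row", "tableName": table_name, "rowID": rid}
        for rid in row_ids
    ]
    resp = requests.post(API_URL, headers=HEADERS,
                         json={"appID": app_id, "mutations": mutations})
    resp.raise_for_status()

def import_rows(app_id: str, table_name: str, records: list[dict]) -> None:
    """Re-import the CSV data: one add-row-to-table mutation per record."""
    mutations = [
        {"kind": "add-row-to-table", "tableName": table_name, "columnValues": rec}
        for rec in records
    ]
    resp = requests.post(API_URL, headers=HEADERS,
                         json={"appID": app_id, "mutations": mutations})
    resp.raise_for_status()
```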
With V1, which works with any type of table, it’s one update per mutation (add/edit/delete).
With V2, which only works with Big Tables, it’s 0.01 updates per row changed.
Note: if you manually import a CSV file via the Data Editor, that’s zero updates regardless of the type of table or number of rows.
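To make that concrete for the 5,000-record example above, here’s the rough arithmetic, just applying the rates described:

```python
# Quick cost arithmetic for the 5,000-record example, using the rates above.
records = 5_000

# V1: one update per mutation, so adding 5,000 rows costs 5,000 updates
# (deleting 5,000 old rows first would cost another 5,000 on top).
v1_updates = records * 1      # 5,000

# V2 (Big Tables only): 0.01 updates per row changed.
v2_updates = records * 0.01   # 50.0

print(v1_updates, v2_updates)  # 5000 50.0
```

So the same 5,000-row import costs 5,000 updates via V1 but only 50 via V2, which is why the table type matters so much here.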