There is no practical file size limitation. We have had customers create extracts of multiple terabytes. Updating should also be possible on databases that large (given enough disk space, of course). In what way did updating fail? Did you receive an exception? If so, what was the exception text?
After further investigation, I noticed that the hyperd.log file contains a "memory-limit-exceeded" warning that leads to a "db-persisting-error".
Our script involves two steps: (a) delete records and (b) insert new records. The whole delete-and-insert process completes, and the script prints the updated number of records, but exactly when the connection to the .hyper file is about to close, the RAM usage of the Tableau Hyper Data Engine spikes sharply. Interestingly, when I remove the delete statement, the process finishes without issue. We are using the Python implementation.
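
For reference, here is a minimal sketch of the pattern described above, using the Hyper API for Python. The file name, table name, and delete predicate are hypothetical placeholders, not the poster's actual script:

```python
from tableauhyperapi import Connection, HyperProcess, Inserter, TableName, Telemetry

# Hypothetical names for illustration; substitute your own file, table, and predicate.
with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint, database="extract.hyper") as connection:
        table = TableName("Extract", "Extract")

        # (a) delete stale records; execute_command returns the affected row count
        deleted = connection.execute_command(f"DELETE FROM {table} WHERE id < 1000")
        print(f"Deleted {deleted} rows")

        # (b) insert the replacement records
        with Inserter(connection, table) as inserter:
            inserter.add_row([1000, "replacement value"])
            inserter.execute()

        count = connection.execute_scalar_query(f"SELECT COUNT(*) FROM {table}")
        print(f"Updated row count: {count}")
    # The RAM spike occurs around here, as the connection closes and Hyper
    # persists (repackages) the updated database to disk.
```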
This behavior is known and is to be expected: when you update or delete rows, Hyper's RAM usage will spike while the database is being saved, because the updated data is repackaged.
Try increasing the amount of memory available to Hyper. If that doesn't help, please share additional details: how large the database is (size on disk, number of columns, number of rows), how many rows you delete, and how high the peak memory usage is. That would let me make a stronger case that this limitation is significant to users, so our development team can prioritize mitigating it.
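
If you want to raise the limit explicitly, Hyper accepts a `memory_limit` process setting at startup. A minimal sketch, assuming the Python Hyper API; the "16g" value is an arbitrary example, so pick something that fits your machine:

```python
from tableauhyperapi import HyperProcess, Telemetry

# "memory_limit" caps how much RAM the Hyper process may use; the value below
# is an example -- choose one appropriate for your hardware.
with HyperProcess(
    telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU,
    parameters={"memory_limit": "16g"},
) as hyper:
    ...  # open the connection and run the delete/insert workload as before
```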