3 Replies Latest reply on Dec 2, 2019 4:02 AM by Jan Finis

    Hyper API Database Maximum Size

    Constantinos Lazarou

      I am currently working with the latest version of the Hyper API and trying to create a database with a single schema and a single table. Although I was able to create the database, it was impossible to update it once the hyper file reached roughly 2.5 GB. Is there a limitation on the size of a hyper file? Any ideas, or has anyone else faced a similar problem?
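      For reference, the creation step is roughly the following (a simplified Python sketch; the schema, table, and column names here are placeholders, not the ones our script actually uses):

      from tableauhyperapi import (
          HyperProcess, Telemetry, Connection, CreateMode,
          TableDefinition, TableName, SqlType, Inserter,
      )

      # Placeholder definition of a single table in a single schema.
      table = TableDefinition(
          table_name=TableName("Extract", "Records"),
          columns=[
              TableDefinition.Column("id", SqlType.big_int()),
              TableDefinition.Column("value", SqlType.text()),
          ],
      )

      with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
          with Connection(endpoint=hyper.endpoint,
                          database="data.hyper",
                          create_mode=CreateMode.CREATE_IF_NOT_EXISTS) as connection:
              connection.catalog.create_schema_if_not_exists("Extract")
              connection.catalog.create_table_if_not_exists(table)
              # Initial load of the data.
              with Inserter(connection, table) as inserter:
                  inserter.add_row([1, "example"])
                  inserter.execute()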

       

      Thank you

      Constantinos

        • 1. Re: Hyper API Database Maximum Size
          Jan Finis

          There is no practical file size limitation. We have had customers create extracts of multiple terabytes. Updating should also be possible on such big databases (given enough disk space, of course). In what way was updating impossible? Did you receive an exception? If so, what was the exception text?

          • 2. Re: Hyper API Database Maximum Size
            Constantinos Lazarou

            Hi Jan,

             

            After further investigation I noticed that the hyperd.log file issues a "memory-limit-exceeded" warning, which leads to a "db-persisting-error".

            Our script entails two steps: (a) delete records and (b) insert new records. The whole process of deleting and inserting completes, and the script prints the updated number of records, but exactly when the connection to the hyper file is about to close, the RAM usage of the Tableau Hyper Data Engine increases enormously. The interesting part is that when the delete statement is removed, the process completes successfully. We are using the Python implementation; the update step looks roughly like the sketch below.
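            Stripped down, the delete/insert step is roughly the following (the predicate and the inserted data are placeholders, not our real ones):

            from tableauhyperapi import HyperProcess, Telemetry, Connection, TableName, Inserter

            table_name = TableName("Extract", "Records")  # placeholder name

            with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
                with Connection(endpoint=hyper.endpoint, database="data.hyper") as connection:
                    # (a) delete existing records matching some predicate
                    connection.execute_command(f"DELETE FROM {table_name} WHERE id < 1000")
                    # (b) insert the new records
                    with Inserter(connection, table_name) as inserter:
                        inserter.add_row([1001, "new value"])
                        inserter.execute()
                    # The updated row count prints fine here...
                    print(connection.execute_scalar_query(f"SELECT COUNT(*) FROM {table_name}"))
                # ...the RAM spike happens while the connection closes and the file is persisted.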

             

            Thank you

            Constantinos

            • 3. Re: Hyper API Database Maximum Size
              Jan Finis

              Hi Constantinos,

               

              This behavior is known and is to be expected: when you update or delete rows, Hyper's RAM usage will spike while the database is being saved, because the updated data is repackaged.

               

              Try increasing the amount of memory available to Hyper (see the sketch below). If this doesn't help, please give additional details on how large the database is (size on disk, number of columns, number of rows), how many rows you delete, and how high the peak memory usage is. That way I can make a stronger case for this limitation being significant to users, so our development team can focus on mitigating it.
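              If you spawn Hyper from the Python API, you can pass process settings when starting it. As a sketch only (the memory_limit setting name and the "8g" value are an example; please check the process settings documentation for your Hyper version):

              from tableauhyperapi import HyperProcess, Telemetry

              # Example only: raise Hyper's memory budget via a process setting.
              # Verify the exact setting name and value against the Hyper API
              # documentation for your version.
              with HyperProcess(
                  telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU,
                  parameters={"memory_limit": "8g"},
              ) as hyper:
                  ...  # open the connection and run the delete/insert as before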

               

              Cheers,
              Jan