1 Reply Latest reply on Feb 6, 2019 10:36 AM by Santiago Sanchez

    Tableau In Memory Database Utilization

    Vinay R


  I have a requirement where I will be receiving multiple XML files per hour that have to be converted to Excel, so the data can be loaded into Tableau's in-memory engine to generate a dashboard.

  After conversion, each Excel sheet grows by more than 5,000 rows per hour.

  But my concern is that an Excel worksheet can only hold a limited number of rows (about 1,048,576 in current versions), so it will eventually fill up.

  How can I use Excel as the data source in the long run, and how can I properly utilize Tableau's in-memory database for this requirement?


  Also, we don't have the option of an ETL tool or a DB/ODBC connection here.

      Source would be Excel only.


  BTW, I have converted the XML to Excel using a Python script.


      Suggestions would be appreciated.

        • 1. Re: Tableau In Memory Database Utilization
          Santiago Sanchez

          Hi Vinay,


          I'm not sure how you can extend the row limit in Excel (although FWIW I remember it used to be 1M rows, but that may have changed). That said, since you are already using Python to convert from XML to Excel, perhaps you can skip the Excel step altogether by using the Extract API - Tableau. The idea is that your Python script would convert the XML directly into a Tableau extract (a .hyper file, for example), which has no practical row limit for this kind of workload.


          The documentation has a few examples of how to use the Extract API, including some in Python: Extract API 2.0 Samples - Tableau
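          As a rough sketch of the parsing half: the XML layout below (`<record>` elements with `<id>` and `<value>` children) is hypothetical, since I don't know your actual schema, and the Extract API calls are only indicated in comments because the SDK has to be installed separately. The point is that once the rows are in plain Python tuples, there is no need for the Excel intermediate step at all:

          ```python
          import xml.etree.ElementTree as ET

          def xml_to_rows(xml_text):
              """Parse a (hypothetical) feed of <record> elements into row tuples."""
              root = ET.fromstring(xml_text)
              rows = []
              for rec in root.iter('record'):
                  # findtext returns the child element's text, or None if missing
                  rows.append((rec.findtext('id'), rec.findtext('value')))
              return rows

          # Hypothetical sample standing in for one of the hourly XML files
          sample = """<feed>
            <record><id>1</id><value>42.5</value></record>
            <record><id>2</id><value>17.0</value></record>
          </feed>"""

          rows = xml_to_rows(sample)

          # Instead of writing `rows` to Excel, feed them straight to the
          # Extract API from the samples linked above, roughly:
          #   from tableausdk.HyperExtract import Extract, TableDefinition, Row
          #   define columns, insert each tuple as a Row, then extract.close()
          print(rows)
          ```

          Each hourly run could then append its rows to the same extract rather than growing a workbook toward Excel's row limit.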


          Hope this helps!

