2 Replies Latest reply on Jun 10, 2013 5:29 PM by Dimitri.B

    Tableau server storage capacity/mechanism and performance with huge data extracts

    SKR A

      Hello,

       

      I need some information and advice about Tableau Server storage capacity/mechanism and performance with very large data extracts. In our case, we have dashboards developed in Tableau that need to be refreshed with the latest data on a daily basis. We have around 15 tables (each with 10-12 columns); 2-3 of them have around 400-500 million records, and the rest have around 1 to 2 million records each. The daily data growth is expected to be around 10k to 50k records. We are planning to keep the data in extracts and refresh them daily. So my questions are:

       

      1. Is there any limit on the storage of data on the Tableau server?

      2. Can we have all 15 tables as Tableau extracts with the amount of data I mentioned above?

      3. How will the Tableau dashboards perform with such a huge amount of data, considering that each dashboard has 3-4 reports and 1 to 2 data sources? You can assume any common server configuration and advise me accordingly.

       

      Regards,

      SKR

        • 1. Re: Tableau server storage capacity/mechanism and performance with huge data extracts
          Toby Erkson

          1.  Yes, for hard drive storage space.  The number of extracts and their size are factors.  I'm guessing that log files and backups will consume additional space as well, depending on how many you keep and for how long.

          2.  Don't think of/treat Tableau Server as a duplicate data source.  The extracts are just that: an extraction of ONLY the necessary data for the visualization task at hand (i.e. the worksheet that is visible).  Do NOT bring in data that you are not using!  (See the sketch after this list.)

          3.  Get as much RAM and as much processing power (CPU cores) as you can, especially RAM.  How the dashboards will perform is anyone's guess, because we don't know what visualizations they'll use, what calculations they're performing, etc.
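          To make point 2 concrete, here is a minimal sketch of trimming the source data down to only the columns and rows the views actually use before building the extract. The connection string, table, and column names are hypothetical, and pandas is used only to illustrate the idea; any staging approach works.

```python
# Sketch of "extract only what you use" (point 2 above).
# Hypothetical source: a warehouse table with 400-500M rows and 10-12 columns,
# of which the dashboard only needs four columns and two years of history.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@warehouse/sales")  # hypothetical connection

query = """
    SELECT order_date, region, product_id, revenue
    FROM   fact_orders
    WHERE  order_date >= CURRENT_DATE - INTERVAL '2 years'
"""
slim = pd.read_sql(query, engine)  # only the needed columns and rows come back

# Stage the trimmed data where Tableau can extract from it (a table, CSV, etc.);
# the extract then holds a fraction of the raw volume.
slim.to_csv("fact_orders_trimmed.csv", index=False)
```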

           

          I know Yahoo! has used Tableau with success, and they deal with terabytes of data -- daily -- so Tableau must be doing something right.

          • 2. Re: Tableau server storage capacity/mechanism and performance with huge data extracts
            Dimitri.B

             The simple answer is: yes, you can, and I am not aware of any physical limit on storage apart from your hardware/OS limitations.

            However, unless you have a monster-of-a-server, your views will be painfully slow if you just dump all the data into extracts without aggregating it.

             I would recommend aggregating and optimising extracts to keep them as small as possible, not because of storage limitations but for view performance.
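             As a rough illustration of that aggregation step (assuming, hypothetically, that the big fact tables are at transaction grain and the dashboards only ever show daily totals per region and product), rolling the data up before extracting it cuts the row count by orders of magnitude:

```python
# Sketch of aggregating before extracting: roll transaction-level rows up to the
# grain the dashboards actually display. File and column names are hypothetical.
import pandas as pd

raw = pd.read_csv("fact_orders_trimmed.csv", parse_dates=["order_date"])

# Daily revenue and order counts per region and product -- if that is all the
# views show, this is all the extract needs to contain.
daily = (
    raw.groupby([pd.Grouper(key="order_date", freq="D"), "region", "product_id"])
       .agg(revenue=("revenue", "sum"), orders=("revenue", "size"))
       .reset_index()
)

daily.to_csv("fact_orders_daily.csv", index=False)  # build the Tableau extract from this
```

             An extract built from the rolled-up table holds a few million rows instead of hundreds of millions, which is what keeps the views responsive.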