1 Reply Latest reply on Feb 27, 2012 8:37 PM by Dimitri.B

    Best way to deal with complex queries when doing extracts.

    Tim Uckun

      I have a very complex star schema with numerous tables. I am currently extracting data to Tableau using a view I created on the database, but I am wondering if there are more efficient ways of doing this.


      One issue is that the data is getting rather large: it takes a pretty long time to extract millions of rows into Tableau, and every workbook contains the entire data extract. One way to redo this might be to create a new data source for each workbook using only the tables needed for its graphs, but is this really more efficient? Wouldn't Tableau create lots of extracts and take even longer? Which of the following options is the best way to deal with this?


      1. Create a view on the database server (or perhaps a throwaway table).

      2. Create a data source in tableau using an SQL statement.

      3. Add each table to the workbook and connect them using the Tableau GUI.


      Anything else I have missed?

        • 1. Re: Best way to deal with complex queries when doing extracts.

          I wouldn't go for option 3. I don't know if it would improve performance at all, but it will definitely cause extra headaches compared to a star schema or database joins.


          Option 1 with a table is a good approach only if it stores processed or aggregated data. But then you have to factor in the time spent building or updating the table, unless that is not an issue, i.e. it can be scheduled overnight.
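          To illustrate option 1, the "throwaway" table can be a nightly-rebuilt aggregate, so the extract only reads the small summary table instead of joining the full fact table every time. A minimal sketch, using an in-memory SQLite database as a stand-in for the real warehouse (all table and column names here are hypothetical):

```python
import sqlite3

# In-memory SQLite database standing in for the real warehouse
# (table and column names are hypothetical, for illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table and one dimension table.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, sale_date TEXT, amount REAL);

INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales  VALUES (1, '2012-02-01', 10.0),
                               (1, '2012-02-01', 15.0),
                               (2, '2012-02-02', 7.5);
""")

# Option 1: materialise a pre-aggregated "throwaway" table (this is the
# part you would schedule overnight); the Tableau extract then reads
# only agg_daily_sales.
cur.executescript("""
DROP TABLE IF EXISTS agg_daily_sales;
CREATE TABLE agg_daily_sales AS
SELECT p.category, f.sale_date, SUM(f.amount) AS total_amount
FROM fact_sales AS f
JOIN dim_product AS p ON p.product_id = f.product_id
GROUP BY p.category, f.sale_date;
""")

for row in cur.execute(
        "SELECT * FROM agg_daily_sales ORDER BY category, sale_date"):
    print(row)
```

          The point is that the expensive join and GROUP BY run once on the database server, not on every extract refresh for every workbook.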


          Option 2 is what I use all the time. I only have to wait once, for the extract to be created. After that, development against the extract is fast, and once the viz is published, the extract refresh is scheduled overnight, so it is not an issue. With the version 7 Data Server feature you can publish one extract and share it between many workbooks, so there is no need to run duplicate extracts.
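          For option 2, the custom SQL you paste into Tableau's connection dialog is just the join query itself; Tableau runs it once when it builds the extract. A sketch of such a statement, with hypothetical names, checked here against SQLite:

```python
import sqlite3

# Stand-in database with hypothetical star-schema tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, sale_date TEXT, amount REAL);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales  VALUES (1, '2012-02-01', 10.0),
                               (2, '2012-02-02', 7.5);
""")

# The statement you would paste into Tableau's custom SQL box
# (names are hypothetical). Selecting only the columns the workbook
# actually needs keeps the extract small.
custom_sql = """
SELECT p.category, f.sale_date, f.amount
FROM fact_sales AS f
JOIN dim_product AS p ON p.product_id = f.product_id
"""

rows = cur.execute(custom_sql).fetchall()
print(rows)
```

          Unlike option 1, nothing is materialised on the server; the query cost is paid only during the scheduled extract refresh.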