
    slow query response times on extract

    Ronak Patel

      I have an extract that is 39 million rows and about 25 columns wide (704 MB extract size). I've noticed that query response times against the extract are around 30-40 seconds, and I'm not exactly sure why. Is there a "limit" to how big an extract can be before performance is no longer acceptable?

       

      I've done as much limiting of the data as I can, but we're loading in a year's worth of data. I also have about 7 global filters, 4 of which the users can select in the dashboards.

        • 1. Re: slow query response times on extract
          Catherine Rivier

          Hi,

          I can't answer your first question, unfortunately, but I can offer one thing: I've found in the past that with very large data sources, parameters ended up being much faster than any filters, global or otherwise. I ran into a case once where filters were taking 20-30 seconds; actions took that down to about 10-12 seconds, and parameters to about 5 (this was a multiple data source case). So you might want to try building some parameters to replace your global filters and see if they improve your speed.

          (Some info on setting these up is here: http://community.tableau.com/thread/121495)
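
          For reference, a minimal sketch of the usual setup (the [Region] dimension and [Region Parameter] string parameter names here are just placeholders; swap in your own fields): create a boolean calculated field that compares the dimension to the parameter, put that calculation on the Filters shelf set to True, and show the parameter control on the dashboard in place of the quick filter.

              // Hypothetical calculated field, e.g. "Region Matches Parameter"
              // Keeps only the rows matching the user's current parameter selection.
              [Region] = [Region Parameter]

          Because the parameter always resolves to a single value, the filter becomes a simple equality test, which is often cheaper for the extract to evaluate than a multi-select quick filter.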

           

          The one catch with parameters is that they work best when users only need to select one value at a time. If your users typically need to select multiple values, you might also try filter actions to replace your filters. It's always good to try a few approaches and run tests to see what works best with your data.
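
          If single selection is too restrictive but users mostly just need an "everything" option, one common workaround (again using the hypothetical [Region] / [Region Parameter] names above) is to add an "(All)" value to the parameter's list and extend the calculation so that value passes every row through:

              // Extended calculated field: "(All)" acts as a pass-through (hypothetical names)
              [Region Parameter] = "(All)" OR [Region] = [Region Parameter]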

           

          Hope this helps a little!