Define 'large' — in terms of file size, row count, and column count?
And to clarify what you've heard: Tableau can be used to visualize summarized data, but the granularity it comes in at does not really matter beyond potential performance implications.
If you choose to create (and periodically refresh) an extract, then I suggest you look into aggregated extracts to see whether they can be utilized:
- Quick Start: Aggregated Extracts
If you need drill-down capability, that is no problem: keep all the data. Tableau should handle it.
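To make the idea concrete, here is a minimal sketch (outside Tableau, using pandas) of what an aggregated extract effectively does: roll detail rows up to the granularity the dashboard actually displays, so the extract carries far fewer rows. The column names and data here are purely hypothetical, not from your workbook.

```python
import pandas as pd

# Hypothetical detail-level data (your real file is ~50k rows x 79 cols).
detail = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "sales": [100.0, 50.0, 75.0, 25.0, 60.0],
})

# Aggregate to the visible level of detail (region x product),
# keeping only the measures the visuals need.
summary = (
    detail.groupby(["region", "product"], as_index=False)
          .agg(total_sales=("sales", "sum"), row_count=("sales", "size"))
)

print(summary)
```

The trade-off is the one mentioned above: a pre-aggregated extract is smaller and faster, but you give up drill-down below the aggregation level, so keep the detail rows if the dashboard needs them.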
Thanks for the response.
The file has 79 columns and about 50k rows. Before loading it to the server, we create a Tableau extract and load that. The dashboard has 4 visuals within it, and maybe 15-20 calculations of varying degrees of complexity. Or at least I think so.
I'm going to look into the aggregated extracts. Thanks for the tip!