Very few extracts are created in seconds, even on relatively small files. Tableau has to take a copy of the data and rewrite it, if you will, into an entirely different format: a .tde. However, most extracts can be queried in seconds even when they are very large. Often, when an extract is not performing well, it comes down to your hard drive needing to be defragmented, or to having too many calculations, badly set-up calculations, etc.
Is there a specific incident or reason you had in mind? In my experience, the width of the data is a bigger issue than its height. How many columns wide is your data, and how many of those columns are strings instead of numbers? Please don't store full sentences in the data and then try to use 15 CONTAINS statements... that will not make you happy.
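A quick way to answer those width questions before building the extract is to profile the source data. This is a minimal sketch, assuming you can load the source (or a sample of it) into a pandas DataFrame; the function name and the toy data are my own, not anything from Tableau:

```python
import pandas as pd

def profile_for_extract(df: pd.DataFrame) -> dict:
    """Return simple width/type stats that tend to predict extract performance:
    column count, row count, how many columns are strings, and how long
    those strings are on average."""
    string_cols = df.select_dtypes(include="object").columns
    return {
        "n_rows": df.shape[0],
        "n_columns": df.shape[1],
        "n_string_columns": len(string_cols),
        "avg_string_length": (
            df[string_cols].astype(str)
              .apply(lambda s: s.str.len().mean())
              .mean()
            if len(string_cols) else 0.0
        ),
    }

# Toy frame standing in for your real source data
df = pd.DataFrame({
    "id": [1, 2, 3],
    "amount": [9.5, 3.2, 7.1],
    "comment": ["free text here", "more notes", "long sentence field"],
})
print(profile_for_extract(df))
```

If `n_string_columns` is a large share of the total, or `avg_string_length` looks like sentences rather than codes, that is exactly the situation described above where CONTAINS-heavy calculations get slow.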
Thanks, Carl, for your quick reply. The information provided is really helpful. I would also like to restate my question more clearly. Please share your thoughts.
We would like to understand whether Tableau recommends any threshold on the maximum record count or file size for a TDE in order to achieve good performance.
For example: a file size of, say, under 2 GB or 3 GB, or a record count of around 100 million. Also, could you share the maximum TDE size you have handled in your experience?
I haven't heard of any file-size limitation for .tde files.
Last year I loaded a dashboard from a 3.8 GB .twbx file... and it had over a billion records (Tableau Desktop ver. 8.2).
This year I noticed that Tableau 9.3 got even better at data compression: that same .twbx file could be compressed further, to fit within a 1 GB file size.
What's your expected data size?