Here is a good link to find out what's going on behind the scenes: Performance Tip: Tableau Performance Recording - Biztory
In summary, for Desktop: Start Recording > perform your actions > End Recording > a new performance workbook will open > find out what is hogging the time.
Hope that helps!
Hi Matthew Risley,
Thanks for the answer, really appreciate it. I've already used performance recording, and it shows that executing queries and computing layout took 50-60 seconds in total. After we optimized the extract (by this I mean materializing the calculated fields), the time spent executing queries and computing layout dropped significantly. So that solves the performance problem.
But the thing is, the data extract resides on Tableau Server, so to optimize the extract we need to download the .tdsx, optimize it, and republish it to the server (which is located somewhere else). With the current extract size and bandwidth we have, that's going to pose some problems.
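For what it's worth, the download/republish round trip described above can at least be scripted. Here's a minimal sketch using the Tableau Server Client Python library (`tableauserverclient`), assuming it's installed and you have publish rights; the server URL, token, site, and data source names are all hypothetical placeholders:

```python
def download_and_republish(server_url, token_name, token_value,
                           site_id, datasource_name, work_dir):
    """Sketch of the round trip: download the published .tdsx, optimize it
    locally in Tableau Desktop (Extract > Optimize), then republish with
    Overwrite. All connection details here are placeholders."""
    import tableauserverclient as TSC  # pip install tableauserverclient
    auth = TSC.PersonalAccessTokenAuth(token_name, token_value, site_id=site_id)
    server = TSC.Server(server_url, use_server_version=True)
    with server.auth.sign_in(auth):
        # Locate the published data source by name.
        ds = next(d for d in TSC.Pager(server.datasources)
                  if d.name == datasource_name)
        # Download the .tdsx (extract included) to a local working directory.
        local_path = server.datasources.download(ds.id, filepath=work_dir)
        # ... open local_path in Desktop, Extract > Optimize, save ...
        # Republish, overwriting the existing data source.
        server.datasources.publish(
            ds, local_path, mode=TSC.Server.PublishMode.Overwrite)
        return local_path
```

Of course, this still moves the full extract over the wire twice, so it doesn't solve the bandwidth problem, only the manual effort.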
Are there any other options to optimize the extract, i.e. optimizing it directly on Tableau Server?
When you mention downloading the .tdsx, optimizing, and republishing, what steps are you taking to do the optimization? (For example, are you filtering out unused fields?)
What we actually do is materialize the calculated fields using the Optimize option on the extract (I'm using the Sample Superstore data for the screenshot).
We've filtered all unused fields out during the ETL process, so there's nothing else we could hide in the extract. We've also added additional processes on the Tableau Server to improve performance.
I do not believe you have to use the Optimize button after every extract refresh.
According to these posts, you are in luck: "Server published extract refresh optimization" and "Does Optimize Extract Feature re-optimize after a scheduled run on Server?"
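If those posts are right that a scheduled refresh on Server re-runs the optimization, the whole download/republish round trip can be avoided by just triggering a server-side refresh remotely. A minimal sketch, again assuming the Tableau Server Client Python library is available and using hypothetical connection details:

```python
def refresh_published_extract(server_url, token_name, token_value,
                              site_id, datasource_name):
    """Kick off a server-side extract refresh for a published data source,
    so the extract (and its materialized calculations) is rebuilt on the
    server without downloading the .tdsx. Placeholders throughout."""
    import tableauserverclient as TSC  # pip install tableauserverclient
    auth = TSC.PersonalAccessTokenAuth(token_name, token_value, site_id=site_id)
    server = TSC.Server(server_url, use_server_version=True)
    with server.auth.sign_in(auth):
        for ds in TSC.Pager(server.datasources):
            if ds.name == datasource_name:
                # Runs asynchronously on the server; returns a job you can poll.
                job = server.datasources.refresh(ds)
                return job.id
    raise ValueError(f"Data source {datasource_name!r} not found")
```

The same thing can be done from the command line with `tabcmd refreshextracts --datasource "name"` if you'd rather not script it in Python.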