Tableau Server Version Used: 10.1
I've been trying to fully understand how caching works in Tableau Server. My understanding is that Tableau caches query results when a dashboard is viewed and keeps them until the cache time specified in Tableau Server Configuration (image below). However, those cached items will be evicted earlier if many other dashboards are loaded in the meantime (assuming the cache size is limited). I understand the cache can be invalidated for other reasons as well, but focusing on this one: is it possible to increase the cache size so that cached items are kept longer regardless of whether other dashboards are accessed?
The practical question I have is this:

My server hosts many dashboards, but several key dashboards need to load faster than the others. For this reason, I'm using the subscription feature to warm up the cache. However, since other dashboards are also used frequently, there are many instances where my key dashboards still take a long time to load. The things I see that could be improved are:
- Increase the subscription frequency (i.e., shorten the interval) during peak hours
- Increase the cache time specified in Tableau Server Configuration
- Increase the cache size to handle the load (I can't find anything about this in the official Tableau documentation)
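As a complement (or alternative) to subscriptions, the warm-up itself can be scripted with `tabcmd` by simply requesting each key view, which forces the server to run the underlying queries. This is only a sketch: the server URL, credentials, and view paths below are placeholders, not values from my environment.

```shell
# Sketch: warm the cache for a few key dashboards with tabcmd.
# Server URL, user, and view paths are placeholders.
tabcmd login -s https://tableau.example.com -u warmup_user -p "$TABCMD_PASSWORD"

# Requesting each view as a PNG makes the server execute its queries,
# populating the cache for subsequent interactive viewers.
for view in "KeySales/Overview" "KeySales/Regional" "Finance/Daily"; do
  tabcmd get "/views/${view}.png" -f "/tmp/$(basename "$view").png"
done

tabcmd logout
```

Run on a schedule (e.g., cron) just before peak hours, this behaves like a subscription-based warm-up but gives finer control over timing.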
I want to understand whether the latter two approaches are viable. The questions boil down to the following:
- If I increase the cache time, will it have any impact? I ask because, if my understanding is correct, frequent use of other dashboards would evict my key dashboards from the cache regardless of the specified time.
- Is it possible to increase the cache size? I have seen this thread: Cache Sizes, which mentions several server configuration keys exposed for changing cache sizes. They don't appear in the Tableau documentation. Does anybody know whether they actually have any impact on caching?
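For reference, on 10.1 settings like the ones in that thread would be applied with `tabadmin`. The key names and values below are taken from community discussion, not from official documentation, so treat them as assumptions and test on a staging server first:

```shell
# Hypothetical example: the cache-size keys below come from community
# threads, not official Tableau documentation; names and values are
# assumptions. These sizes are entry counts per VizQL process, not bytes.

# Query result cache size per VizQL process
tabadmin set vizqlserver.querycachesize 300

# Model (workbook/view) cache size per VizQL process
tabadmin set vizqlserver.modelcachesize 60

# Apply the configuration and restart for the change to take effect
tabadmin configure
tabadmin restart
```

Note that larger caches increase VizQL Server memory usage, so any change like this should be checked against available RAM on the node.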
Note: Improvements to the dashboards themselves are being considered and followed up on separately. Here I want to understand what can be done to increase the cache hit rate for the key dashboards.
Thank you for your time,