Let's start with the premise of 3500 concurrent users. How did you come up with this number, as it seems really big? What is your use case?
There are 3 main areas that drive the sizing of your Tableau Server infrastructure.
1. Concurrent workload. This is the number of users who are connected to your Tableau Server and making requests, i.e. they are clicking on a dashboard and making the server work.
In its sizing white paper, Tableau factors in a 10% concurrency rate, i.e. 10% of the total number of users who can access the system are using it at any one time. However, your environment may differ from this, so the rate needs to be confirmed (or estimated). Your question states that you have 3500 concurrent users. Is this true, or is this the total number of users who can access the system?
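To see why that distinction matters, here is a quick back-of-the-envelope check of the 10% rule of thumb. The figures below are illustrative, taken from the question's 3500:

```python
# Rough sizing arithmetic using the Tableau white-paper's 10% concurrency
# rule of thumb. The numbers are illustrative, not measured.
total_or_concurrent = 3500
concurrency_rate = 0.10

# If 3500 is the TOTAL user base, expect roughly 350 concurrent users:
print(round(total_or_concurrent * concurrency_rate))

# If 3500 really is the CONCURRENT workload, it implies ~35,000 total users:
print(round(total_or_concurrent / concurrency_rate))
```

The two readings differ by two orders of magnitude in implied user base, which is why it is worth pinning down before sizing any hardware.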
2. Dashboard complexity. The complexity of a dashboard determines how long it takes to render. For instance, a simple dashboard that shows one piece of information will be much quicker than a dashboard pointing to the same data source that has multiple sheets, loads of table calcs, maps, etc. Therefore it is worth looking at and testing your dashboards. If a dashboard takes longer than 5 seconds to render in Tableau Desktop, it is unlikely to be any faster on Tableau Server.
3. Your data. The size of your data, especially if you are using extracts, is going to determine how much memory you need on your server. The same goes for the number of extracts you are using.
Also consider how much query caching will be utilised by your reports. Every time someone runs a query (i.e. clicks on a report), the results of that query are cached. If someone else runs the same query, they will use the results in the cache rather than querying the data source. This can massively improve dashboard performance if end users are all running the same simple reports and mostly viewing the output. If they have reports with lots of interactivity (filters / parameters), there may not be as much opportunity to use caching.
My advice would be to perform testing.
Build out a test server environment with your workbooks and data sources, and use TabJolt to put load on the server at different concurrent workloads. You will be able to see whether the server environment is adequate for your requirements, or whether you need additional resources (memory / cores, etc.).
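TabJolt is the right tool for real load testing against Tableau Server; the sketch below only illustrates the underlying idea of ramping up concurrent "viewers" and measuring how many requests complete. The `render_dashboard` stub and its 10 ms delay are invented for illustration:

```python
# Toy concurrent-load harness: fire N simulated users at a stub render
# function via a thread pool. A real test (e.g. with TabJolt) would hit
# Tableau Server's actual view endpoints instead.
import time
from concurrent.futures import ThreadPoolExecutor

def render_dashboard(view_id):
    """Stand-in for an HTTP request that renders a dashboard."""
    time.sleep(0.01)   # pretend the render takes 10 ms
    return f"rendered {view_id}"

def run_load(concurrent_users, requests_per_user=5):
    start = time.time()
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        jobs = [pool.submit(render_dashboard, u)
                for u in range(concurrent_users)
                for _ in range(requests_per_user)]
        results = [j.result() for j in jobs]
    elapsed = time.time() - start
    return len(results), elapsed

done, secs = run_load(concurrent_users=20)
print(done)  # 100 requests completed
```

Running the same workload at increasing `concurrent_users` values and watching response times climb is essentially what a TabJolt test plan does, with real dashboards and real server metrics.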
The answer above is a little simplistic, but your question can't really be answered without a good understanding of your environment and your requirements.
Hope this helps
To add to what has been said (understanding the foundation, users, etc.):
Also see this article on processes, resources, and performance to help you understand which processes use what, e.g. cores = backgrounder, etc.