Here is the related white paper, which you can download.
Good answer from Mark.
The answer to what is right for you will depend on many factors, such as:
1. How much data you will be extracting every day
2. How complex your dashboards are, which affects rendering time
3. How many subscriptions Tableau will process every day
4. How many concurrent users will be consuming reports on the server
5. The load from ad-hoc analysis, with Desktop connecting to published Data Server extracts
It is the combination of all these things that will dictate the number of cores you will need. 100 concurrent users consuming a simple report connected to a live database may put the same load on the server as 10 concurrent users looking at complex dashboards against large Data Server extracts.
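To make that comparison concrete, here is a back-of-the-envelope sketch (not a Tableau formula; the render times are hypothetical numbers picked to mirror the example above) that models interactive load as concurrent users times average render time per view:

```python
# Rough load model: load ~ concurrent users * average render seconds per view.
# All numbers here are hypothetical, chosen only to mirror the example in the text.

def interactive_load(concurrent_users, avg_render_seconds):
    """Return a relative load score (user-seconds of rendering demanded)."""
    return concurrent_users * avg_render_seconds

# 100 users on a simple report over a live connection (~1 s to render)
simple = interactive_load(100, 1.0)

# 10 users on complex dashboards against large extracts (~10 s to render)
complex_dashboards = interactive_load(10, 10.0)

print(simple, complex_dashboards)  # both come out to 100.0
```

The point is that user counts alone tell you very little; workload shape matters just as much.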
All I can do is offer you some perspective based on our deployment. I have almost 14k users with access to Tableau Server, but typically only 200-300 unique users are on the server each day. On average, measuring concurrency at 15-minute intervals during peak hours, I'll have 10-20 people active on the server, with peaks of about 40. An average day is 3-4 hours of processing time for interactive usage. Almost everything we do is extracted: we extract around 25 billion data points (rows x columns) every day, for a total processing time of around 24 hours each day. About 400 reports get processed and emailed from the server every day, taking about 5 hours of CPU time. Finally, I have 135 Desktop users who use Data Server extensively; I don't have a good way to measure the load they put on the server.
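For scale, a quick sanity check on those extract numbers (figures taken straight from the post; "data points" means rows times columns):

```python
# Back-of-the-envelope throughput for the extract workload described above.
data_points_per_day = 25_000_000_000   # 25 billion data points extracted daily
processing_hours = 24                  # total extract processing time per day

points_per_second = data_points_per_day / (processing_hours * 3600)
print(f"{points_per_second:,.0f} data points/second")  # ~289,352 per second
```

That sustained throughput, not the interactive user count, is what drives the core count in this deployment.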
My setup is a 16-core high-availability deployment. It is largely the extracts that create the need for 16 cores. But if you are creative, you could offload extract creation to other machines using the Extract API.
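The offloading idea can be sketched generically. This is not the Extract API itself, just a simple longest-job-first balancer (the classic LPT greedy heuristic) showing how nightly extract jobs might be spread across worker machines; the job durations and worker count are made up for illustration:

```python
import heapq

def balance_extracts(job_minutes, num_workers):
    """Assign extract jobs to workers, longest job first, always onto the
    currently least-loaded worker (longest-processing-time greedy heuristic)."""
    # Min-heap of (total_minutes_assigned, worker_index).
    workers = [(0, i) for i in range(num_workers)]
    heapq.heapify(workers)
    assignment = {i: [] for i in range(num_workers)}
    for job in sorted(job_minutes, reverse=True):
        load, idx = heapq.heappop(workers)
        assignment[idx].append(job)
        heapq.heappush(workers, (load + job, idx))
    return assignment

# Hypothetical nightly extract jobs (minutes) spread over 3 worker machines.
jobs = [90, 60, 45, 30, 30, 15]
plan = balance_extracts(jobs, 3)
print(plan)
```

Each worker ends up with roughly equal total minutes, so the nightly extract window shrinks without buying more Server cores.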
Please mark the answer as CORRECT & HELPFUL if it really helps you, so that it can help others as well.