For a potential 10K users, you should do more than count costs for servers and software licenses; your firm should create a vibrant community of business analysts (that is, 100 data visualization experts). On this point, see Andy Kriebel's 2013 presentation.
Hope this helps.
How many concurrent users?
What will be the data source? Make sure your data source can handle the load.
An 8-core server is sized for roughly 100 concurrent users.
The answer to what is right for you will depend on a lot of factors like:
1. How much data you will be extracting every day
2. How complex your dashboards are, which affects rendering time
3. How many subscriptions Tableau will process every day
4. User concurrency when consuming reports on the server
5. Load from ad-hoc analysis through Desktop connecting to published Data Server extracts
It is the combination of all these things that will dictate the number of cores you need. 100 concurrent users consuming a simple report that connects to a live database may put the same load on the server as 10 concurrent people looking at complex dashboards against large Data Server extracts.
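To make that trade-off concrete, here is a toy sizing sketch of my own (not an official Tableau formula): approximate load as concurrent users times a per-request "cost" that bundles render complexity and data-access time. The cost numbers below are assumptions for illustration only.

```python
# Toy model: relative server load = concurrent users x per-request cost.
# "Cost" is a unitless stand-in for render complexity + data-access time.

def relative_load(concurrent_users, cost_per_request):
    """Unitless load score; only meaningful for comparing scenarios."""
    return concurrent_users * cost_per_request

# 100 users on a simple live-connection report (cheap requests)...
simple = relative_load(100, cost_per_request=1.0)

# ...can match 10 users on complex dashboards over large extracts,
# if each complex request is ~10x as expensive (an assumed ratio).
complex_ = relative_load(10, cost_per_request=10.0)

print(simple, complex_)  # 100.0 100.0
```

The point is that headcount alone tells you nothing; you have to estimate the cost of a typical request in your workload before core counts mean anything.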
All I can do is offer you some perspective based on our deployment. I have almost 14k users with access to Tableau Server, but typically only 200-300 unique users are on the server each day. On average, measuring concurrency at 15-minute intervals during peak hours, I'll have 10-20 people active on the server, with peaks of about 40. An average day is 3-4 hours of processing time for interactive usage. Almost everything we do is extracted. We extract around 25 billion data points (rows x columns) every day, which takes a total processing time of around 24 hours each day. There are about 400 reports that get processed and emailed from the server every day, taking about 5 hours of CPU time. Finally, I have 135 Desktop users that use Data Server extensively; I don't have a good way to measure the load they put on the server.
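Concurrency measured this way (distinct users per 15-minute interval) is easy to compute yourself from a request log; here is a stdlib-only sketch, where the log entries and user names are made up:

```python
from datetime import datetime

# Hypothetical request log: (user, timestamp) pairs. In a real
# deployment you would pull these from Tableau Server's repository.
events = [
    ("alice", datetime(2014, 3, 3, 9, 2)),
    ("bob",   datetime(2014, 3, 3, 9, 11)),
    ("alice", datetime(2014, 3, 3, 9, 14)),
    ("carol", datetime(2014, 3, 3, 9, 20)),
]

def bucket(ts, minutes=15):
    """Floor a timestamp to the start of its 15-minute interval."""
    return ts.replace(minute=ts.minute - ts.minute % minutes,
                      second=0, microsecond=0)

# Distinct users per interval = concurrency in the sense used above.
per_bucket = {}
for user, ts in events:
    per_bucket.setdefault(bucket(ts), set()).add(user)

peak = max(len(users) for users in per_bucket.values())
print(peak)  # 2 (alice and bob share the 9:00-9:15 bucket)
```

Deduplicating users within each bucket matters: one user clicking around generates many requests but only one unit of concurrency.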
My setup is 16 core high availability. Largely it is the extracts that create the need for a 16 core deployment. But if you are creative, you could offload your extract creation to other machines using the extract API.
BTW, I'd suggest you remove the price tag from the original post. Generally Tableau wants you to comment on pricing via their sales process and not on the forum.
"you could offload your extract creation to other machines using the extract API."
You could also use one of the ETL solutions published here: third-party tools able to export/import data to/from the Tableau Data Engine.
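Whichever route you take (the Extract API or a third-party ETL tool), offloading boils down to distributing extract jobs across machines other than the Server. A minimal round-robin scheduling sketch, where the worker and job names are hypothetical and the actual extract call is a stand-in:

```python
from itertools import cycle

# Hypothetical machines dedicated to building extracts off the Server.
workers = ["etl-box-1", "etl-box-2", "etl-box-3"]

# Hypothetical extract jobs; in reality each would invoke the Extract
# API (or an ETL tool) on its assigned machine, then publish the result.
jobs = ["sales", "inventory", "finance", "hr", "web_logs"]

def assign(jobs, workers):
    """Spread extract jobs across worker machines round-robin."""
    ring = cycle(workers)
    return {job: next(ring) for job in jobs}

plan = assign(jobs, workers)
print(plan["sales"], plan["hr"])  # etl-box-1 etl-box-1
```

Even a naive distribution like this keeps the heavy extract builds off the cores that serve interactive users.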
Edited my first post to include info on desktop users.
Agreed with Mark. Extract refreshing is the most CPU-intensive operation that Tableau Server performs on our instance, and it can impact the user experience if you run too many, too often, for too long while users are on the system. If you refresh them during off-hours, you will do much to allow your server to support more users.
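The "refresh during off-hours" rule is easy to enforce in a scheduling script. A minimal check, assuming a business day of 8:00 to 18:00 (adjust for your deployment):

```python
from datetime import time

# Assumed business hours; pick whatever matches your user base.
BUSINESS_START = time(8, 0)
BUSINESS_END = time(18, 0)

def is_off_hours(t):
    """True if a refresh scheduled at time t avoids business hours."""
    return not (BUSINESS_START <= t < BUSINESS_END)

print(is_off_hours(time(2, 30)))  # True  - 2:30 AM is safe
print(is_off_hours(time(10, 0)))  # False - competes with users
```

Gating each scheduled refresh through a check like this is a cheap way to keep Backgrounder load away from interactive peaks.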
Some stats from our deployment, representing peak usage:
- 25 users / minute
- 150 users / hour
- 750 / day
- 1,500 total active users
- 8,776 / day
- 500 / hour
- 2,100 extracts refreshed / incremented daily (750 of these refreshes take place during business hours)
- 750 of these are sent daily, primarily just before business hours start
We are running 5 physical hosts at 16 cores apiece in an HA configuration, with two of the workers dedicated to Backgrounders (8 processes on each, for a total of 16). The other three are responsible for all the other Server processes, with the Primary running 1 VizQL, 2 app server, and 1 Data Server process.
CPU utilization on Primary peaks at 25%, average of 10%
CPU utilization on Worker 1 peaks at 40%, average of 10%
CPU utilization on Worker 2 peaks at 30%, average of 20%
Well Brent, you won't get a definitive answer here because it's not simple to figure out. What the others have offered is more than I can give, and I agree with all of them 100%. Depending upon the success of your venture, the way in which the company does reporting can change and thus bring about more challenges, for better or for worse.
One thing I recommend is knowing that Tableau is more capable and different than past, typical BI tools like Cognos, Business Objects, Microstrategy, etc. (I've worked with a few of these in the past). So don't think that you can compare Tableau equally with them as I don't think it's quite that simple.
We are successful with a single 8-core server (core-based license). We have over 80 Desktop licenses (and growing weekly!) peppered throughout our global company, just over 2,000 Tableau Server users (and growing), and a peak of about 10 concurrent users, in a growing high-extract environment reporting against Oracle, MS SQL Server, IBM DB2, Excel, and text files. While the official BI tool is Cognos, we are seeing program/project managers run away from it and into Tableau due to speed and simplicity -- those hard-to-measure intangible benefits. Plus it's so nice for them to not be reliant on IT for report development -- one less bottleneck.
If you're serious about Tableau then you NEED to talk to a sales representative. They can get you pointed in the right direction. Support is really good at all levels with Tableau Software, seriously. If you have ever had to deal with IBM < ahem > "support" then you'll find Tableau Software support just blows them outta the water!