Could you please go through the link below for the technical specifications of the Tableau environment?
It provides the full sizing information for the different deployment options.
Hope this helps!!
Hi Sogoli --
Since the Fast Data Engine is column-based, the number of columns you're dealing with is as important as the number of rows -- just something to keep in mind. The columns' data types and cardinality also impact extract creation. In essence, there's no single answer to your question because each dataset is unique.
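To give a feel for why cardinality matters so much in a column store, here's a rough back-of-the-envelope sketch (hypothetical, not Tableau's actual code or storage format): columnar engines typically dictionary-encode each column, so a low-cardinality column compresses to a tiny dictionary plus narrow integer codes, while a high-cardinality column gains little.

```python
# Hypothetical illustration of dictionary encoding in a column store.
# The sizes are made-up approximations, not Tableau's real numbers.

def dict_encoded_size(values, value_size=8):
    """Rough estimate: one dictionary entry per distinct value,
    plus a 4-byte integer code per row."""
    distinct = set(values)
    return len(distinct) * value_size + len(values) * 4

rows = 1_000_000
low_card = (["US", "CA", "MX"] * (rows // 3 + 1))[:rows]   # 3 distinct values
high_card = [f"id-{i}" for i in range(rows)]               # every value distinct

print(dict_encoded_size(low_card))   # small dictionary, cheap codes
print(dict_encoded_size(high_card))  # dictionary as large as the data itself
```

Same row count in both columns, but the high-cardinality column costs several times more to encode -- which is why two datasets with identical row counts can behave very differently at extract time.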
That being said:
The extraction process uses the most RAM during the "sort" phase (towards the end of the extract-creation process), and that part generally goes faster when you have more RAM, because we don't need to "swap" rows/columns to disk.
Unlike other in-memory data engines, we don't need to fit the entire dataset into physical RAM to do all this work. In other words, you can extract and analyze large datasets on hardware that couldn't possibly support the same work in other products you may be familiar with. Will it be fast? Nope -- not as fast as a machine with more RAM. But it'll work.
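The "sort without fitting everything in RAM" idea is essentially external merge sort. Here's a minimal sketch (my own illustration, not Tableau's actual engine): sort bounded-size runs in memory, spill each run to a temporary file on disk, then stream-merge the runs while holding only one value per run in RAM.

```python
# Minimal external merge sort: sorts more data than the in-memory
# budget allows by spilling sorted runs to disk. Hypothetical sketch;
# the real Fast Data Engine is far more sophisticated.
import heapq
import os
import tempfile

def external_sort(values, max_in_memory=1000):
    run_files = []
    # Phase 1: cut the input into sorted runs that fit the memory budget.
    for start in range(0, len(values), max_in_memory):
        run = sorted(values[start:start + max_in_memory])
        f = tempfile.NamedTemporaryFile("w+", delete=False)
        f.write("\n".join(map(str, run)))
        f.seek(0)
        run_files.append(f)
    # Phase 2: k-way merge the runs, keeping only one value per run in RAM.
    iterators = [(int(line) for line in f) for f in run_files]
    merged = list(heapq.merge(*iterators))
    for f in run_files:
        f.close()
        os.unlink(f.name)
    return merged
```

The trade-off is exactly the one described above: with more RAM you get fewer, larger runs (or no spilling at all) and the sort is faster; with less RAM it still completes, just with more disk traffic.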
Hope this helps a little.