We have gone with Tableau for a reporting solution and are pretty happy with the general features and functionality.
Our project could not be described as Big Data by any stretch of the imagination: the original fact tables were only about 2m rows, with 12 dimensions and ~200 measures.
We opted for live connections so that we could leverage security at the database level, using AD groups and SQL Server's IS_MEMBER() function to handle row-level security.
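For anyone unfamiliar with the approach, the pattern looks roughly like the sketch below: a view filters fact rows based on the caller's AD group membership, and Tableau's live connection queries the view rather than the base table. All table, column, and group names here are illustrative, not from our actual schema.

```sql
-- Hypothetical row-level security view (names are made up for illustration).
-- IS_MEMBER('DOMAIN\Group') returns 1 when the current Windows login
-- belongs to that AD group, so each user only sees their regions' rows.
CREATE VIEW dbo.vFactSales_Secured
AS
SELECT f.*
FROM dbo.FactSales AS f
JOIN dbo.DimRegion AS r
    ON r.RegionKey = f.RegionKey
WHERE IS_MEMBER('DOMAIN\Sales_' + r.RegionCode) = 1;
```

Because the filter runs in SQL Server on every query, the security travels with the live connection and nothing sensitive is baked into the workbook.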
Our dashboards do include a number of calculations that have to be performed in Tableau, such as ratios and calculated measures for KPI colour coding.
We found that even with dashboards that were only looking at separate periods of information, load performance was not acceptable (>40 seconds). This took us down the route of aggregate tables, which meant losing a number of dimensions in order to reduce the aggregated fact tables to ~200k rows.
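For context, building those aggregate tables was essentially a GROUP BY over the base fact, keeping only the dimensions the dashboards actually slice by. A minimal sketch, again with hypothetical names:

```sql
-- Hypothetical aggregate build: collapse the fact table onto the
-- dimensions the dashboards use, summing the additive measures.
-- Dropping the other dimension keys is what shrinks 2m rows to ~200k.
SELECT DateKey,
       ProductKey,
       SUM(SalesAmount) AS SalesAmount,
       SUM(Quantity)    AS Quantity
INTO   dbo.AggFactSales
FROM   dbo.FactSales
GROUP BY DateKey, ProductKey;
```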
Currently it takes around 15s to load this data, which still seems too long to me.
What would be great is a standard Tableau test that performs some fairly complex calculations over a medium-sized set of SQL data (generated via a tally table or similar). If we could get a feel for how long this kind of "standard" test takes in other environments, we could at least start to understand whether our environment is significantly under-performing compared to others.
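Generating the shared test data would be straightforward. Something like the following could produce a reproducible medium-sized fact table from a tally of row numbers; the table name, row count, and measure formulas here are just placeholder assumptions for illustration:

```sql
-- Hypothetical test-data generator using a tally (numbers) table.
-- Cross-joining sys.all_objects with itself yields millions of rows,
-- from which we take the first 2,000,000 numbered 1..n.
;WITH Tally AS (
    SELECT TOP (2000000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
SELECT n                 AS RowId,
       n % 12            AS DimKey,        -- fake dimension key
       (n % 365) + 1     AS DayOfYear,     -- fake date attribute
       CAST((n % 1000) * 1.37 AS DECIMAL(12, 2)) AS Measure1
INTO   dbo.TestFact
FROM   Tally;
```

Since the values are deterministic (no RAND/NEWID), everyone running the script would get an identical dataset, which is what you want for comparing environments.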
We would be interested to hear about anyone else's experience with performance on medium to large datasets. I understand that v8 (which we hope to move to in the next few months) brings some good performance improvements and some features around performance monitoring.