First, you're gonna need to read the manual. It talks about dimensions vs. measures and why what Tableau is doing is correct.
Second, keep your data pulls concise (get only the data you need and nothing more) and work from an extract instead of a live connection. There are other data-tuning options, like aggregating and optimizing the extract, but I'm not experienced enough to explain those and they're covered in the manual. The initial pull of data can take a while, but once it's in extract form your visualization speeds should increase dramatically.
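To make the "concise pull" idea concrete, here's a minimal sketch using Python's built-in sqlite3 module with a made-up `sales` table (the table, columns, and data are all hypothetical): instead of pulling every raw row and column, select only the fields the viz uses, pre-aggregated at the level of detail the dashboard actually displays.

```python
import sqlite3

# Hypothetical source table -- stand-in for whatever database feeds the dashboard.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL, note TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [("East", "A", 100.0, "x"), ("East", "B", 50.0, "y"),
     ("West", "A", 75.0, "z"), ("West", "B", 25.0, "w")],
)

# Concise pull: only the two fields the viz needs, already aggregated --
# not SELECT * over every raw row.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 150.0), ('West', 100.0)]
```

The same principle applies whether the pull is a custom SQL connection in Tableau or the query behind an extract: fewer columns and pre-aggregated rows mean a smaller extract and faster renders.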
Finally, don't expect every single visualization to render in 1-2 seconds; that's unrealistic. Even with a smaller data source, a complex dashboard/worksheet with many filters, many conditional expressions, etc., is gonna take processing time.
I've worked (and am working) for companies with very large data sets, and they're doing just fine. So it depends on how well the database is set up: indexing, statistics/optimizer, network status, ODBC vs. native connection, SQL coding, RAM, CPU, OS, custom fields, number of operations on the dashboard, etc. Without a specific scenario (with plenty of details!) we can't give you specific solutions.
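One of those database-side factors, indexing, is easy to demonstrate. Here's a hedged sketch with sqlite3 (hypothetical `orders` table and index name): an index on a frequently-filtered column lets the engine seek to matching rows instead of scanning the whole table, which is exactly the kind of tuning that makes a live connection or extract refresh faster.

```python
import sqlite3

# Hypothetical table with a column the dashboard filters on constantly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "East" if i % 2 else "West", float(i)) for i in range(1000)],
)

# Without an index, this filter is a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE region = 'East'"
).fetchall()
print(plan_before)  # plan detail mentions a SCAN of orders (wording varies by version)

conn.execute("CREATE INDEX idx_orders_region ON orders(region)")

# With the index, the planner can SEARCH using idx_orders_region instead.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE region = 'East'"
).fetchall()
print(plan_after)  # plan detail now references idx_orders_region
```

On a real warehouse the syntax differs, but the principle carries over: check the query plan for the queries your dashboard generates, and index the columns it filters and joins on.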