Maybe you could evaluate an intelligent brute-force approach, for example adding SSD disks to the Oracle machine, or creating a big RAM disk and moving tempdb there.
Good thoughts, Christian, but we've already added SSDs in our highest tier of storage and are running on 8 cores with 64 GB of memory. We just had a meeting with Tableau's technical folks, and they are going to get back to us with a better way to figure out how much RAM is needed.
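While waiting on Tableau's guidance, a rough back-of-envelope estimate can at least bound the problem. This is a sketch only: the per-column byte widths and the compression factor below are illustrative assumptions, not Tableau's actual sizing formula.

```python
# Rough back-of-envelope RAM estimate for an in-memory extract.
# The byte widths and compression factor are assumptions for
# illustration, not Tableau's actual sizing formula.

def estimate_extract_gb(rows, column_bytes, compression=5.0):
    """Estimate in-memory size in GB for `rows` records.

    column_bytes: average uncompressed bytes per value, one entry per column.
    compression: assumed columnar compression ratio (a guess).
    """
    raw_bytes = rows * sum(column_bytes)
    return raw_bytes / compression / 1024**3

# Example: 1 billion rows with 10 numeric columns (8 bytes each)
# plus 5 short string columns (~20 bytes each).
size_gb = estimate_extract_gb(1_000_000_000, [8] * 10 + [20] * 5)
print(f"~{size_gb:.0f} GB estimated")
```

With those assumed numbers a billion-row extract lands in the tens of GB, which is why the 64 GB box feels tight once the OS and Tableau itself take their share.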
But the point still stands: NOT to use a standard relational database, and to use an analytic database engine instead. We could then forgo the billion-row extract and possibly connect live instead.
Anybody out there using an analytic backend with Tableau?
thanks for the insight Christian
excellent. thanks Christian!
Keep the feedback coming the rest of you!
I'm using Tableau with an SSAS OLAP cube (among other things) on the backend. Performance has been reasonable considering the size of the cube (the fact table has > 2 billion rows) and the middling optimization of our cube. But be aware that many of Tableau's features don't work with cubes, or don't work the same way, so you'll need to create a lot of calculated members, either in the cube or in Tableau, to get some of the same capabilities. I'd suggest taking a look at the Tableau workbook that uses the AdventureWorks sample cube to get an idea of what it's like working with cubes in Tableau. No experience here with columnar stores, but I'm interested to know how they work with Tableau.
Thanks for the feedback, Sean, and great insight that Tableau handles data from cubes differently. We aren't currently accessing any cube data (we only have Cognos Transformer cubes).
I will definitely keep this thread up to date on our research and experience with an analytic database backend (NoSQL, NewSQL, columnar, etc..). We have quite a laundry list (probably too many) of vendors that we want to try out.
more to come soon....
This is a great question, and I'm curious what solution you landed on for your back end data solution.
Hi Erin (and everyone else on the thread!)...we ended up trying the Community Edition of HP Vertica (up to 1 TB of storage, 3-node cluster) and were blown away by the performance. So we ended up licensing 2 TB of HP Vertica and are moving all TDE sources to HP Vertica as live connects in Tableau. The performance is astounding.
Good luck everyone!
Did you encounter any crashes with Vertica? How stable is the "rig"?
Two analytical, high-performance, column-store and/or in-memory databases made in the EU:
1. www.exasol.com // Germany, commercial (see the TPC-H results, sorted by performance)
2. www.monetdb.org // Netherlands, open source (see http://homepages.cwi.nl/~mk/ontimeReport)
Hi Christian - we have not encountered a crash in Vertica in the ~8 months that we've had it. We are still running on one node and getting ready to expand to a 3-node cluster...fingers crossed
I'd have to ask our DBAs/engineers about the "rig"...I manage the team and am less hands on.
Thanks for the tips on the two EU products...can't wait to check them out. Do you have any experience with either of them?
Well, not yet; I would like to test MonetDB + Tableau with 350 million records. I shall keep you updated.
Exasol does have a rep office open in the States, in San Francisco.