Fascinating stuff, all.
In keeping with the physics approach (I'm not a physicist, but I've read a Scientific American issue or several), I think of the Tableau canvas as a holographic 2D projection of a complex multi-dimensional space consisting of a potentially large number of dimensions with mysterious interactions between them.
On the one hand, Tableau's canvas is a simple drag-and-drop space that responds to direct actions on the data objects - Dimensions and Measures - by generating context-appropriate visualizations of aggregated quantities within a fixed-frame organizational context. This is its original design space, and where it's a beautiful product, with little room between how well it does the job and how well the job CAN be done. (Joe: you asked me for a description like this years ago but I wasn't able to express it then. Here's my stab at it.)
Beyond this initial design space, it is, as Matt pointed out, a universal drawing canvas where one can, if one's motivated and clever enough, coerce Tableau into drawing pretty much anything. Which is all well and good, if one needs a universal drawing canvas that requires mastery of a broad and deep store of arcane knowledge that isn't available in any canonical form.
Another way I think of Tableau's canvas is as the visible wall in Plato's Allegory of the Cave. In this conception mere mortals (the vast majority of Tableau users) are only able to see surface projections of a richer, deeper reality that's hidden from them, the workings of which they can only infer from the shadows cast by hidden actors. Trying to understand what's going on behind the curtain is an almost Sisyphean task; whenever the curtain is coaxed aside for a glimpse into the works it mostly shows other curtains. One way I like to think of what Jonathan has done to figure things out is that he's been doing X-ray diffraction with a miner's helmet and a bed sheet.
I've been trying to come up with a decent visualization of Tableau's data processing mechanisms for a number of years, and have been frustrated with my inability to capture the space in a way that makes sense - thanks Rody and Simon for yours, they're very helpful and informative.
On the one hand Tableau is a beautiful, elegant tool that makes simple, straightforward data analysis simple and straightforward. But on the other hand it makes pretty much everything outside of its simple original design space much, much harder to puzzle out and effect than it could be.
One of the great tragedies of the past ten years is that Tableau started out on the good path and then lost its way, becoming increasingly capable but ever more inscrutable as functionality was added, often seemingly bolted on without real consideration for the whole space. So we're left with a canvas that's hobbled in its utility by the baroque complexity of the functional mechanisms that access, manipulate, and present data, and by Tableau's continuing lack of any real documentation on what it does, how it does it, why it does things the way it does, and ultimately how to conceive of the domain of possibilities and generate implementations from that domain.
Finally, one of the questions I need to ask, of myself and my clients, is: why would you want to create things in Tableau that require deep, deep knowledge that takes a long time to acquire, that are difficult to implement, and that can only be supported and/or replicated by the very few, when it's simpler, easier, and more transparent to create them with some other, better tool or technology?
Thank you Chris!
I very much agree with many of the points you make here. I love that Tableau provides a lot of "How to's", but, as you said, there isn't much documentation on what Tableau is actually doing, and how it does it. Much of that knowledge has been acquired through conversations and debates amongst Tableau users, or by piecing together many "unrelated" documents (and of course our brute-force, no-quit attitude to see Tableau behind the veil).
Thanks for this…a very interesting post, and one where I see both sides.
Now I’ve only been using Tableau for just over a year, but have spoken about the ‘early’ years with several ‘early-adopters’. So with that caveat! I think when Tableau started out, it very much saw itself as a pure visualisation tool…”We are a Visualisation Tool, not an ETL/data-prep tool”. Which if kept like this would probably have meant a far simpler tool. However the market (thinking of the kind of data-prep that can be done, albeit with code, in QlikView – again I caveat this as coming from QlikView users I know, not having used it myself), and customers who want/expect a data one-stop-shop, have meant Tableau has had to add in more and more features (such as LoD expressions) that allow this data-prep-type work to be done. This has inevitably led to an increase in the complexity of what is going on ‘under-the-bonnet’, but as you say it has become increasingly capable.
In a way Tableau is a victim of its own ‘Easy-to-use’ UI, meaning that (from my training experience) users who have little ‘data experience’ are now users (non-IT/database people wouldn’t even attempt to use Cognos!). These users just want things to work, just loading in their data as it comes to them and expecting Tableau to have the features to allow them to do things which (in Tableau’s “We are a Viz Tool only” days) they would have been advised to do in the ETL to get the data in the ‘right’ shape.
This extra functionality/complexity has been X-rayed (nice way of looking at it btw!) by the ‘non-mortals’, which, as well as giving a better understanding of ‘what’s under the bonnet’, has also led to them being able to exploit this knowledge (I’m thinking of such things as the Order of Operations, Cache Levels (which were brilliantly exploited in the BlackJack Viz), ‘Ignore in Table Calc’…many of which are undocumented) to use the software in even more imaginative ways. So this is where (apart from the lack of documentation!) I see this as a plus.
However from a training perspective it’s becoming increasingly complicated. I’m always in two minds as to whether, for the ‘average’ (and I mean that in a good way!) user, I’m better off explaining the ‘truth’ (or what we know of it) at the risk of total confusion, or just showing them how to do something (an example would be using a Table Calc filter – is it better to just show them how to do it, or spend 2 hours going through the ‘Order of Operations’ – seeing as many of the people I train come from an Excel background and I only get 2 days with them)…I opt for ‘sizing up’ my audience’s ‘data experience’ and playing it from there. With all that said, it’s still fairly easy to get people to what I call 70% Tableau usage (by this I mean you have the knowledge contained in the major training videos, or Dan Murray’s book) and for many, this is more than enough to massively improve their data-life.
I’ve recently been fortunate enough to acquire Alteryx, and it is a perfect marriage with Tableau. I now very rarely invoke much of the very complicated functionality, as Alteryx makes it so easy to go back and re-shape the data, create new scaffold/aggregated columns…etc. By way of example, we have a footy table in the office (like any self-respecting IT company!!) and I’ve created a Tableau dashboard. One of the things people wanted was Top Winning Streak (both in Time and Number of Games)…Pre-Alteryx this was a very complicated calculation (here’s how it’s done http://community.tableau.com/message/420189 and then I have to bring back just the top one!)…with Alteryx I code the Win Streak into a Column, and then in Tableau it’s just a MAX on this column. To me this is the perfect example of what Tableau initially set out to do (i.e. create the complex logic in another tool, and visualise it in Tableau) vs. just load the data in raw and expect to have functions that let you do this kind of thing.
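The pre-shaping idea above can be sketched in a few lines of code. This is just an illustrative sketch in Python (not the actual Alteryx workflow, and the data and function names are hypothetical): walk the games in chronological order, emit a running win-streak column, and then the "viz layer" only needs a simple MAX over that column.

```python
# Sketch of the "pre-compute the streak, then just MAX it" approach.
# A hypothetical chronological win/loss record for one player;
# in the real workflow this column would come from the source data.

def add_win_streaks(results):
    """Given a chronological list of booleans (True = win),
    return a parallel list holding the running win streak:
    each win extends the streak by 1, each loss resets it to 0."""
    streaks = []
    current = 0
    for won in results:
        current = current + 1 if won else 0
        streaks.append(current)
    return streaks

games = [True, True, False, True, True, True, False]
streaks = add_win_streaks(games)
print(streaks)        # the pre-computed streak column
print(max(streaks))   # longest winning streak: a trivial MAX downstream
```

Once the streak lives in its own column, the downstream tool never needs windowed/table calculations at all, which is exactly the division of labour being described.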
…thank you for a very thought provoking post!