At first glance it seems that you are using context filters in a different way than they should be used.
You should put everything into context, and then apply the Top N filter.
That is very helpful, as now I understand better how it all works. BUT do I need to add all my various filters to the context, and how does that impact the other dashboards in the workbook?
Well, it's easy. Tableau 7 uses two-pass filtering. First the context filters are applied. So if you want to see the top 10 products by sales, you should first apply the other filters, like product category or customers. Then come the non-context filters, which filter the output of the context filters.
Step 0: So, you have a dataset.
Step 1: Filter by the context filters, which will reduce your dataset to only relevant values. This will be the basis for the top calculations. For example, you filter only for the relevant product group and customers.
Step 2: If you have non-context filters, they will be applied on the results from the previous step. So you apply the Top filters here.
Behind the scenes, after step 1 you will have a temporary table or an intermediate extract file, and Tableau will apply step 2 on that. Personally I hate this behaviour, but yes, it works like this. There are much more elegant solutions.
If you do not use context filters, all filters are applied at the same time. You will do the top 5 AND the product group filtering, so if the top 5 products are not in the selected product group and the filters are logically AND-ed together, you will see an empty dataset.
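The two cases above can be sketched in plain Python. This is a simplified model with made-up sample data, not Tableau's actual engine, but it shows why the AND-ed single pass can come back empty while the context-filter two-pass does not:

```python
# Made-up sample data: (product, category, revenue)
sales = [
    ("A", "Office", 100), ("B", "Office", 90), ("C", "Office", 80),
    ("D", "Tech", 500), ("E", "Tech", 400), ("F", "Tech", 300),
]

def top_n(rows, n):
    """Top N rows by revenue."""
    return sorted(rows, key=lambda r: r[2], reverse=True)[:n]

# With a context filter: first reduce to the category, THEN take the top 2.
context = [r for r in sales if r[1] == "Office"]
with_context = top_n(context, 2)          # top 2 *within* Office: A and B

# Without context filters: top 2 over everything, AND-ed with the category filter.
global_top = top_n(sales, 2)              # D and E, both in Tech
without_context = [r for r in global_top if r[1] == "Office"]  # empty!

print(with_context)
print(without_context)
```

Because the global top 2 products both sit in the Tech category, AND-ing that result with the Office filter leaves nothing, which is exactly the empty dataset described above.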
How does it work with your different sheets? It is also something I don't like, but all sheets are calculated individually, so there is no real connection between them (except global filters). But the same logic is applied: first the context filters, then the other filters.
@Tamas - Tableau actually has several layers of filtering, here's Joe Mako's post on that: http://www.tableausoftware.com/support/forum/topic/question-how-create-product-velocity-calculation#comment-39603.
@Julia - An alternative to using context filters as a solution is to use table calcs. This has the advantage of being more flexible; the disadvantage is that it requires more data to be moved from the data source to Tableau for processing, and that can be slower for larger data. The volumes you seem to have, though, wouldn't be a problem.
Uhm, good to know, excellent post
The problem with this is: imagine that you have 20 billion records and you want to get the top 10. You must create an extract or temporary table, which could contain the same 20 billion records. It will take ages.
However, if Tableau were smart enough to create inline views instead of temporary tables in such cases, the database engine could process them much more effectively. I played with some ODBC drivers to disable the temporary table feature, but instead of inline views Tableau fell back to extracts, which is totally useless with that amount of records.
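To illustrate the difference, here is a small sketch using sqlite3 as a stand-in for the real database (the table and column names are made up, and this is not the SQL Tableau actually emits). The temp-table style materializes the context-filtered rows as a copy; the inline-view style hands the engine one statement it can optimize as a whole:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (product TEXT, category TEXT, revenue INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("A", "Office", 100), ("B", "Office", 90), ("C", "Office", 80),
    ("D", "Tech", 500),
])

# Temp-table style: materialize the context filter first, then query the copy.
con.execute("CREATE TEMP TABLE ctx AS SELECT * FROM sales WHERE category = 'Office'")
temp_result = con.execute(
    "SELECT product FROM ctx ORDER BY revenue DESC LIMIT 2").fetchall()

# Inline-view style: the same logic as a single nested query, with no
# intermediate copy of the (potentially huge) filtered data.
inline_result = con.execute("""
    SELECT product FROM (SELECT * FROM sales WHERE category = 'Office')
    ORDER BY revenue DESC LIMIT 2""").fetchall()

print(temp_result)
print(inline_result)
```

Both return the same rows, but with 20 billion records the temp table would itself be enormous, while the inline view lets the engine push the filter and limit down in one pass.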
I would need more control on the database side like in the good-old 'traditional BI' tools.
I'm bringing up an old thread, but I think it's important and maybe the good guys @ Tableau will do something about it.
Right now I have the same problem as Julia, but in our case we have a rather large table, ~5 billion rows, and the way Tableau sends these Top N queries to the DB is terrible.
In order to do a Top N calculation it does an inner join to the same table without any WHERE clause (you can see a detailed example of the type of query here). You can imagine that doing a scan on 5B rows takes some time even on a powerful machine.
Doing the same Top N query in the database takes around 9 sec (for the last 10 min of data), while in Tableau it takes several hours.
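The query shapes being compared look roughly like this (a sketch with invented table and column names, again using sqlite3 as a stand-in; the exact SQL Tableau generates for a given database may differ). The self-join pattern ranks each row by joining the table against itself, so it touches the data twice with no WHERE clause; the direct pattern is a single ordered scan with a LIMIT:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (product TEXT, revenue INTEGER)")
# 26 rows with distinct revenue values.
con.executemany("INSERT INTO events VALUES (?, ?)",
                [(chr(65 + i), (i * 37) % 100) for i in range(26)])

# Self-join pattern (the shape described above): rank each row by counting
# how many rows have a revenue at least as large -- a full scan joined
# against a full scan, which blows up on billions of rows.
self_join = con.execute("""
    SELECT a.product
    FROM events a JOIN events b ON b.revenue >= a.revenue
    GROUP BY a.product
    HAVING COUNT(*) <= 3
    ORDER BY MAX(a.revenue) DESC""").fetchall()

# Direct pattern: one ordered scan with a LIMIT, which the database can
# often answer from an index in seconds.
direct = con.execute(
    "SELECT product FROM events ORDER BY revenue DESC LIMIT 3").fetchall()

print(self_join)
print(direct)
```

Both produce the same top 3, but the self-join is roughly quadratic in the row count, which is consistent with a hand-written query finishing in seconds while the generated one runs for hours.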
The workaround of adding the filter to the context does not work very well, since a temporary table is created inside the DB and again it takes a while to get the desired answer.
Also, an extract is out of the question since we need to look at live data. Last year I suggested a sliding window extract that could really help in a situation like this.