I have a dashboard where a user can select from one of 150 attributes (individual columns) using a parameter.
This parameter drives a calculated field that surfaces the values of the chosen attribute in a filter, so filter selections can be made against that dynamic field.
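For context, the parameter-driven formula is a CASE calculation along these lines (the parameter and field names below are placeholders, not my actual schema):

```
// Hypothetical calculated field [Selected Attribute Value].
// [Attribute Selector] is the 150-value string parameter.
CASE [Attribute Selector]
    WHEN "Region"   THEN [Region]
    WHEN "Segment"  THEN [Segment]
    WHEN "Category" THEN [Category]
    // ... one WHEN branch per attribute, 150 in total
END
```

The filter is then placed on [Selected Attribute Value], so changing the parameter changes which column the filter operates on.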
Right now I import all of this as one massive table. Would it make sense, from a performance perspective, to split these attributes out into a separate table, build a second extract, and join the field values in Tableau?
Keep in mind that I'm working with a quarter of a billion rows, and the viz shows distinct counts, etc., so there's really no way to shrink the data any further than I already have.
Moving to more of a star-schema approach with multiple extracts would mean a lot of redesign in the workbook: each dimension table as a separate extract, with the main base table containing only keys and the values to be aggregated.
Before I do this I'm looking for guidance from anyone who has done something like this in the past.
In every test I've run, the extract is faster than querying the database live, but I'm wondering whether multiple extracts with the joins done natively in Tableau will give me the performance boost I need to justify a rebuild.