Yes, there is a way you can do what you want in Tableau. First I'll recommend that you try using Data Blending instead of defining a join, and see if that gives you the performance and flexibility you desire.
To address your specific request, you can in fact mass-assign aliases in a data source using data from a different data source. The process requires Data Blending as an intermediate step, but the final result is a single data source with all of the descriptive names applied as aliases directly to the dimensions of interest. Please see my comment addressing a request posted in our Ideas forum: http://community.tableau.com/ideas/1543
Note that having a large number of aliases will increase the size of your .twb workbook file. If you save this file as a .twbx it will compress the aliases well, but will increase overhead when decompressing each time you load the workbook. You may find that Data Blending suits your needs well and does not bloat your workbook files to the same extent as aliases may.
I hope this helps,
Robert, your link is broken.
My link was fine; the forum software was broken. All is well now. Thanks for noticing this!
Thanks for your reply. I have tried the blending approach (too slow and cumbersome to set up) and the "blending as an intermediate step" approach. I think my takeaway was "too much 'by hand' work" - i.e. you have to drag, drop, and set up everything manually, and it is not clear to me how you can save and reuse what you have painstakingly set up (by hand), if that makes any sense.
i.e.: what happens when the data refreshes and a few new values appear that weren't there before? What happens if you have to change something? What happens if you lose your blending setup? Would I have one table containing values for each field - i.e. if I had 10 fields that needed to be looked up, would I have 10 tables? If I wanted to refresh a .tde with the Python Data Extract API, how would all of this work? Does it only work in Tableau Desktop? Etc., etc., etc...
I will look / try again. My question is: does my method of joining the lookup values into one (HUGE) "table" BEFORE importing into Tableau give the same results as any of these other methods? Or would going to the trouble of figuring out how to automate the blending / mass-alias-assignment approach have better performance?
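For context, the pre-join method described above can be sketched in plain Python. This is a minimal illustration only - the field names, codes, and lookup values are hypothetical, not the poster's actual data:

```python
# Hypothetical fact rows with a coded field, plus a lookup table
# mapping each code to its descriptive name (illustrative only).
facts = [
    {"dept_code": 10, "sales": 100},
    {"dept_code": 20, "sales": 250},
    {"dept_code": 10, "sales": 75},
]
dept_lookup = {10: "Hardware", 20: "Software"}

# Pre-join: attach the descriptive name to every fact row, producing
# one wide table that can be imported into Tableau as-is, with no
# blending or alias assignment needed afterward.
wide = [dict(row, dept_name=dept_lookup.get(row["dept_code"], "Unknown"))
        for row in facts]

print(wide[0])
```

With 10 coded fields you would repeat the lookup step once per field (or per lookup table) before the import; the trade-off, as discussed below in the thread, is extract-creation time rather than extract size.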
We have purchased ten 1-hour blocks of Professional Services time:
My name is Cecelia Caban and I'm the Services Coordinator for the Professional Services team here at Tableau Software. I wanted to thank you for your recent purchase of the Expert Helpdesk Package. This allows desktop users to contact our consultants as needed for up to a 10-hour block that can be used in one-hour increments.
Seems like maybe you would be the guy to talk to. We also have a few "4 hour" sessions from someone somewhere (Tableau?). My method works great; I just don't know if it is the "right" method to use...
I also have questions about the new feature in Tableau 8 that is supposed to run each sheet on a dashboard in parallel instead of sequentially. Does this only work on Server? Does it really only work if the sheets are accessing different data sources? (I am currently using .tde files exclusively.)
Sorry for the huge post. Thanks, and thanks in advance for any more insight you may be able to provide.
We are preparing to go ahead and call in for a one hour or four hour session. I would like to be able to get straight to the right person. Any thoughts?
Thanks for the additional detail. I'll address your tangential points first: I am strictly a Developer and I do not get involved in Professional Services contracts, but I have worked with them internally enough to know that they're a very sharp team and care a great deal about customer satisfaction. You're in good hands! As for your questions about Parallel Dashboards, I don't have an answer but I did notice your other forum post on this topic and I hope it gets answered.
As for the current topic, you bring up some important points. Yes, the process I described for importing aliases is a manual, one-time operation. The aliases will not dynamically update based on changes to an external data source. In contrast, Data Blending will reflect the latest data in all data sources but may experience reduced performance if you are blending on one or more very high cardinality dimensions (i.e. fields that have a large number of unique values, perhaps 50,000 or more). Last, a table join may increase the time it takes to create an extract, but will not increase the size of the final extract by much due to the dictionary compression used for strings.
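As an aside, the cardinality concern mentioned above is easy to measure before deciding whether to blend. A minimal sketch in plain Python - the field names and rows are illustrative assumptions, and in practice the rows would come from something like `csv.DictReader` over an exported copy of the data source:

```python
from collections import defaultdict

def field_cardinalities(rows):
    """Count distinct values per field. Blending on fields with very
    high cardinality (e.g. 50,000+ unique values) tends to be slow."""
    distinct = defaultdict(set)
    for row in rows:
        for field, value in row.items():
            distinct[field].add(value)
    return {field: len(values) for field, values in distinct.items()}

# Illustrative in-memory rows standing in for a real data source.
rows = [
    {"region": "East", "order_id": "1001"},
    {"region": "West", "order_id": "1002"},
    {"region": "East", "order_id": "1003"},
]

print(field_cardinalities(rows))  # {'region': 2, 'order_id': 3}
```

A low-cardinality field like `region` is a safe blending key; a field that is unique per row, like `order_id` here, is the kind that can make a blend crawl.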
I hope this helps,