Forgive the title, it's likely generic for the question I'm about to ask.
We're running a POC to white-label Tableau into our products. The question stems from the sensitive nature of our client data, how we store it, and thus the end-user experience when clients access it.
We want a user to log into our portal and authenticate through Okta. Upon doing so, we want to dynamically bind or pass parameters to the data model. For context, we keep client data in separate Postgres schemas. All schemas share the same table/view structure, and in the past we've been able to simply pass the schema associated with a user to offer this personalized experience.
We also report on a huge volume of transactional data. So if I were to cache a model, would the model have to have a dedicated repository, or would it generate one cached instance?
My question is whether we can utilize any of the caching features in Tableau Server, or in any of the data tools, to achieve what I describe below.
If you think about maintenance of a data model, we'll have the same model/reports for each client, but will pass a different schema name at runtime (though runtime can get long): Client_1.reporting_view, Client_2.reporting_view, etc. gets passed into the model depending on which client is logged in. By passing the data connection dynamically, we only need to maintain one model.
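To make the mechanism concrete, here is a minimal sketch of the kind of runtime schema substitution described above. The mapping and function names are hypothetical, not our actual implementation; only the `Client_N.reporting_view` naming comes from the setup described.

```python
# Hypothetical sketch: bind the authenticated user's client to its
# dedicated Postgres schema, then build the query against the shared
# reporting_view that every schema exposes with identical structure.

def reporting_query(client_key: str) -> str:
    """Return the per-client query by substituting the schema at runtime."""
    # Illustrative mapping of logged-in client -> Postgres schema.
    client_schemas = {"client_a": "Client_1", "client_b": "Client_2"}
    schema = client_schemas[client_key]
    # Only the schema prefix changes; the view name is the same everywhere.
    return f"SELECT * FROM {schema}.reporting_view"

print(reporting_query("client_a"))  # SELECT * FROM Client_1.reporting_view
```

The question, then, is whether Tableau Server can warm or hold caches for each of these schema-bound connections ahead of login, rather than paying the query cost at runtime.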
From a technical perspective, considering Tableau Server and its features, is there any technique or feature we could use to cache all possible data sources for one model so they're available upon client login?
Similarly, with the data tools, is there any way to load all of these separate schemas into memory?