I really don't understand the concept of what you're trying to do. Can you clarify?
We currently have a number of Tableau Data Extracts (TDEs) running on Server01, and we want to migrate them over to Server02.
After migration, instead of using the Tableau scheduler to run the refresh tasks, we would like to use Pentaho Data Integration (PDI) together with the Tableau Data Extract Output plugin for Pentaho to refresh these TDEs.
Any suggestions?
Still unclear, but let me try.
- If Pentaho is building the extracts locally or on Server01 (i.e. the TDE files), then you can publish them to Tableau Server using `tabcmd publish` (see the tabcmd Commands documentation).
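As a rough sketch, publishing a locally built extract with tabcmd might look like the following. The server URL, credentials, project name, and file path are all placeholders you would substitute for your environment:

```shell
# Log in to the target Tableau Server (URL and credentials are placeholders)
tabcmd login -s https://server02.example.com -u admin -p "password"

# Publish the extract into a project; --overwrite replaces a datasource
# of the same name if it already exists
tabcmd publish "C:\extracts\sales.tde" --project "Default" --overwrite

tabcmd logout
```

You could call this from a PDI shell-script step after the extract is built, so the publish happens as the last step of the transformation.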
- If the extracts are already hosted on Tableau Server (Server01) and you are running a clustered deployment where Server01 and Server02 are both part of the cluster, then you can go into the Tableau Server configuration GUI, turn on the data engine for Server02, and save the configuration, which copies over all the extracts. Then, as a separate step, turn the data engine off on Server01.
- If you are running a single-server deployment and just moving onto new hardware (Server02), then take a `tabadmin backup` on Server01 and run a `tabadmin restore` on Server02.
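A minimal sketch of that backup/restore path, assuming both machines run the same Tableau Server version and the backup filename shown is a placeholder:

```shell
# On Server01: take a full backup; -d appends the date to the filename
tabadmin backup tabserver_backup -d

# Copy the resulting .tsbak file to Server02, then on Server02:
# restore is run with the server stopped
tabadmin stop
tabadmin restore tabserver_backup-2016-01-01.tsbak
tabadmin start
```

The restore brings over extracts, workbooks, users, and schedules together, which is usually simpler than moving the TDE files individually.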
But we want PDI (Pentaho Data Integration) with the Tableau Data Extract plugin enabled so that it refreshes the TDEs.
I downloaded PDI and the Tableau Data Extract plugin separately.
How do I configure those steps? Can you suggest anything?
It sounds like that is outside the scope of Tableau specifically, since the plugin is supplied by Pentaho. Can you ask them about its configuration?