Are you using blended databases?
If yes, could you try creating a single data connection that includes the columns from all of the tables, defining the joins between tables as part of the data connection?
When Tableau Server is installed in an environment with dynamically allocated memory, changes in RAM allocation might result in insufficient memory allocation for Tableau Server.
- Create a full backup of Tableau Server. For more information, see Back Up Tableau Data in Tableau Help.
- Work with your IT team to reconfigure the virtual environment to use static RAM instead of dynamically allocated memory. Verify that the RAM allocation meets the Tableau Server Technical Specifications.
- On the Windows Start menu, navigate to Control Panel > Programs and Features, click Tableau Server, and then click Uninstall.
- Delete the Tableau Server folder from the following location:
- On a 64-bit machine: C:\Program Files (x86)\Tableau\Tableau Server and C:\ProgramData\Tableau\Tableau Server
- Restore the backup with the "--no-config" option. For more information, see Restore from a Backup in Tableau Help.
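On pre-TSM versions of Tableau Server for Windows (the Control Panel installs described above), the backup and restore steps map to `tabadmin` commands roughly as follows; the backup file name is just an example:

```shell
# Run from the Tableau Server bin directory before uninstalling.
tabadmin backup tabserver_backup.tsbak

# After reinstalling on the reconfigured VM, restore data only,
# keeping the fresh install's configuration.
tabadmin restore --no-config tabserver_backup.tsbak
```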
Lénaïc RIÉDINGER, Global Community Engineer Tableau
If you see a helpful or correct response, please mark it using the buttons below the post!
Thanks for the input.
I made some progress with debugging.
This error is so vague that I didn't know where to start (nothing was logged in DataExtract.log).
After I started monitoring every aspect of the machine's resources, it turned out that in my case the error occurred when there wasn't enough RAM available at the moment tdeserver started.
How much RAM is enough for tdeserver to start? Not sure... does anyone know how much should be enough?
This machine has 16GB and runs only one Java process, which generates these TDE extracts using the SDK.
So I reduced the heap allocation for the Java process to 5GB; estimating that Java native memory will take about 3GB more, the process will use roughly 8GB in total. That leaves at least 8GB free for tdeserver and the Extract API to use (assuming no other resource-hungry process is running on the box).
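For reference, a heap cap like the one described above is set with the JVM's `-Xmx` flag when launching the process; the jar name here is a placeholder for your own SDK-based extract generator:

```shell
# Cap the JVM heap at 5GB so tdeserver keeps enough free RAM.
# (Native/off-heap memory is not covered by -Xmx; budget for it separately.)
java -Xmx5g -jar extract-generator.jar
```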
For now I ran a couple of iterations with the expected load and the system handled it well. If I had to handle more, I would start with 32GB of RAM and tune the system from there.
After generating the TDE files, we used the REST API to publish the data sources successfully.
Use case: able to generate TDE files with 3 million records in less than a minute and publish them using the APIs.
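For anyone following the same route, publishing via the REST API sends the TDE as a multipart upload together with a small `tsRequest` XML payload. Here is a minimal sketch of building that payload; the datasource name, project id, and API version are placeholders, not values from this thread:

```python
# Sketch: build the tsRequest XML that accompanies a datasource publish
# via the Tableau Server REST API. Server URL, site id, and auth are
# assumed to be handled elsewhere (sign-in returns the auth token).
import xml.etree.ElementTree as ET

def build_publish_request(datasource_name, project_id):
    """Build the XML for the 'request_payload' part of the upload."""
    ts_request = ET.Element("tsRequest")
    datasource = ET.SubElement(ts_request, "datasource", name=datasource_name)
    ET.SubElement(datasource, "project", id=project_id)
    return ET.tostring(ts_request, encoding="unicode")

payload = build_publish_request("sales_extract", "project-1234")
# This payload is POSTed as multipart/mixed to
# /api/<version>/sites/<site-id>/datasources alongside the .tde bytes.
print(payload)
```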
Excellent, thank you so much for sharing the resolution with us