Hi Jalal. We ran into a similar issue: a 30-minute timeout was set on our data warehouse resource queue, and if a query ran past it, the data would come back either partial or empty. There are a few options to address this.
- Can you ask your DBAs to increase the timeout, at least temporarily, so you can get all the data? Or run with a higher-privileged ID or queue?
- Can you set the extract up as incremental with a date? An incremental extract essentially picks up where it left off.
- What version are you running? If it's pre-10.5, can you upgrade so that the new Hyper data engine can be leveraged? It's faster.
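To make the incremental option above concrete, here's a minimal Python sketch of the idea: only pull rows newer than the date you last loaded, instead of re-fetching everything. The `orders` table, `order_date` column, and the stored high-water-mark value are all hypothetical, just to illustrate the pattern:

```python
# Incremental-extract pattern: remember the newest date already loaded
# (the "high-water mark") and only query for rows after it.
last_loaded = "2018-06-30"  # hypothetical value saved from the previous run

# Hypothetical table/column names; in Tableau this is what an
# incremental refresh keyed on a date column does under the hood.
query = (
    "SELECT * FROM orders "
    f"WHERE order_date > '{last_loaded}' "
    "ORDER BY order_date"
)
print(query)
```

Because each refresh only fetches the new slice, the query is far less likely to hit a warehouse timeout than a full re-extract.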
Thanks for the advice. I'm currently running the latest version of Tableau, so I don't think there's much efficiency to be gained there.
I think you're probably right that there's an upper timeout limit set on the DB itself. I'm currently running the same query in my workbench to see if it will complete for 2017 & 2018.
Right now the company doesn't have any DBAs, so troubleshooting DB issues falls on me. I'm not a DBA, so it can be a bit difficult sometimes.
I'll have my CTO check whether there's an upper limit or priority set.
Also, I haven't really used incremental updates; I should definitely look into them for the reports that use this query.
I'll keep the thread posted on any progress.
This shouldn't be an error on Tableau's side, correct?
It's doubtful that it's an error with Tableau. You can look in various places to see if you can raise the timeout. Here's one that I found: How can I change the default Mysql connection timeout when connecting through python? - Stack Overflow
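If the database turns out to be MySQL, one common place to raise the timeout is at the session level. This is a minimal sketch that just prints the `SET SESSION` statements you'd run before the long query; the specific values are illustrative assumptions, not recommendations, and your warehouse may enforce separate queue-level limits on top of these:

```python
# Real MySQL session variables that commonly cut off long fetches.
# The values below are assumptions for illustration only.
session_settings = {
    "net_read_timeout": 7200,   # seconds the server waits while reading from the client
    "net_write_timeout": 7200,  # seconds the server waits while writing a result to the client
    "wait_timeout": 28800,      # seconds an idle connection is kept open
}

# Build the statements you would execute on the connection before the big query.
statements = [f"SET SESSION {name} = {value};" for name, value in session_settings.items()]
for s in statements:
    print(s)
```

Running these only changes the current session, so it's a safe way to test whether a timeout is the culprit before asking anyone to change server-wide config.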
Confirmed that there is some issue with the configs, priorities, and resources being assigned.
The CTO and Director of DevOps have been trying things out, to no avail. The issue is some sort of limit/timeout on the fetch.
Thanks for the advice, Jeff. They said it gave them a starting place.