I'm assuming that the amount of time it takes to build the extract locally using Desktop is considerably less than 4 hours. In my experience, when I know I can build an extract on my desktop much faster than the more powerful server can, I explore a few options:
a. If I can get the server admin to use a desktop license on the server to create the extract, the time that extract takes to complete will help isolate whether the issue is server overload or network latency.
b. If that extract completes on the server in something close to my desktop times, then it is likely a server capacity/scheduling issue. I make sure the time slot chosen for the refresh isn't being used by other extracts; I want a slot with no, or very few, other processes contending for server resources. If that doesn't fix it, then it becomes a server capacity and settings issue that needs some tuning.
c. If the desktop extract performed on the server machine takes just as long as the server's own extract, then the issue is in your network. Solutions depend on what options you have within your environment. It might just be throttling imposed by your network admins, which could be solved through some network settings. If that isn't an option, then replicating your data to a server closer to the Tableau Server and extracting from there might reduce your time. Or you could offload the extract update to a server on your local network, have it create the extract, and then republish the extract to Tableau Server using tabcmd.
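That last option can be scripted. A minimal sketch using tabcmd, where the server URL, credentials, extract file, and project name are all placeholders you'd substitute with your own values:

```shell
#!/bin/sh
# Republish a locally refreshed extract to Tableau Server with tabcmd.
# Server URL, user, file name, and project below are hypothetical.

tabcmd login -s https://tableau.example.com -u publisher -p "$TABCMD_PASSWORD"

# --overwrite replaces the published data source of the same name,
# so workbooks pointing at it pick up the new data automatically.
tabcmd publish "sales_extract.hyper" \
    --name "Sales Extract" \
    --project "Finance" \
    --overwrite

tabcmd logout
```

Run this from a scheduled job on the local network server after it finishes building the extract file.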
First figure out what is causing the longer run times on the server than on the desktop; then, depending on what solutions your company permits, you can settle on an approach that, while not as simple as using Tableau Server directly, will minimize the update time in your environment.
Thanks for the suggestion...
We have some Oracle replication on the same server, and it took 45 minutes to complete the task, but the Tableau extract needs 4 hours.
I'm also wondering: if I need to extract the data from 9 different databases located in different countries, what is the best solution for that?
Your note didn't mention that you've done the isolation testing to eliminate network bandwidth as an issue. Do that first, IMO; more often than not, network connectivity is the culprit.
After ruling out bandwidth, if you are using Oracle and the same SQL to build a table only takes 45 minutes, then try building a materialized view on the Oracle server and using that as the source for your Tableau extract. (I'm assuming you don't have any issues with network lag between that Oracle server and the Tableau server.) Since there is no custom SQL from Tableau in this case, this will be the fastest data transfer your network will permit.
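A minimal sketch of such a materialized view, assuming a hypothetical source query (the table and column names are illustrative only):

```sql
-- Precompute the extract's result set on the Oracle side.
-- Table and column names here are placeholders.
CREATE MATERIALIZED VIEW sales_extract_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT o.order_id,
       o.order_date,
       c.region,
       o.amount
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;

-- Refresh it on your own schedule, before the Tableau extract runs
-- ('C' requests a complete refresh):
-- EXEC DBMS_MVIEW.REFRESH('SALES_EXTRACT_MV', 'C');
```

Tableau then connects to the materialized view as if it were a plain table, so the extract query becomes a simple full-table read.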
You could also look at the server logs (either Oracle's or Tableau's) to see the exact SQL that Tableau executes for the extract. That could illuminate ways to improve the build time, such as adding indexes, constructing materialized views, or using WITH clauses and other SQL optimization constructs. If you have a tool that analyzes SQL for optimization, run the query through it to see what it suggests (just recognize that some suggestions might not be implementable through the Tableau custom SQL dialog box, so you might have to get creative to find a solution).
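On the Oracle side, one way to find that SQL without digging through log files is to query the shared SQL area. A sketch, assuming you have SELECT privileges on the V$ views (and Oracle 12c+ for the FETCH FIRST syntax):

```sql
-- Longest-running statements currently cached in the shared pool.
-- Look for the one Tableau issues during the extract refresh;
-- elapsed_time is reported in microseconds.
SELECT sql_id,
       executions,
       ROUND(elapsed_time / 1e6) AS elapsed_sec,
       SUBSTR(sql_text, 1, 120)  AS sql_snippet
FROM   v$sql
ORDER  BY elapsed_time DESC
FETCH FIRST 10 ROWS ONLY;
```

Once you have the sql_id, you can pull the full text and execution plan to see where the time is going.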