4 Replies Latest reply on Mar 12, 2013 5:05 AM by Russell Christopher

    How do I schedule an extract refresh to run every 3 minutes?

    Michelle Baumann



      I need to convert from a live connection to our SQL Server to using extracts to improve the performance of a workbook.  The oldest the data can be is 2-3 minutes.  When I go to schedule the extract, I am only offered 15 minutes as the shortest option for periodic refreshes.  I decided to create multiple staggered schedules to at least get down to a five-minute refresh, but starting times can also only be set at the same 15-minute intervals.


      What do I need to do to get the automatic refresh down to two to three minutes?  And what happens when a user attempts to pull up the view during a refresh?


      Thanks so much for any assistance with either question!

        • 1. Re: How do I schedule an extract refresh to run every 3 minutes?
          Tracy Rodgers

          Hi Michelle,


          Currently, the most frequently an extract can be refreshed is every 15 minutes. Someone might have a workaround for this, but nothing that is supported. This might be a good one for the Ideas section.



          • 2. Re: How do I schedule an extract refresh to run every 3 minutes?
            Justin Smith



            Can you use tabcmd to refresh the extract? If so, you can have a script or batch file that calls tabcmd to refresh the extract, run on the machine that runs Tableau Server.


            Here is a link to the tabcmd commands. It shows the login and refresh commands you would need: http://onlinehelp.tableausoftware.com/v7.0/server/en-us/tabcmd_cmd.htm#id7eedf1d3-9ae7-4119-9d68-efc5632ca75b__id4cdb3410-1c41-4dad-b1d2-306542ac9b32
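            To make that concrete, here is a rough sketch of what such a batch file might look like. The server URL, credentials, and data source name below are placeholders I made up; check the tabcmd help linked above for the exact options in your version:

```bat
@echo off
REM Log in to Tableau Server (replace the URL, user, and password with your own)
tabcmd login -s http://your-server -u admin -p yourpassword

REM Kick off an extract refresh for a published data source (name is a placeholder)
tabcmd refreshextracts --datasource "Your Data Source"

REM Log out to release the session
tabcmd logout
```

            Save that as something like refresh_extract.bat on the server machine and you have a file the Task Scheduler can call.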


            After you get the batch file set up to refresh the extract, set it up in Windows Task Scheduler to run at your desired interval. Here is a link to setting up the Task Scheduler: http://www.hosting.com/support/windows-server-2008/create-a-scheduled-task-in-windows-server-2008


            Note that I had to go into the task's properties to change it from running once a day to running more often.
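            If you prefer the command line over the Task Scheduler UI, something along these lines should create an equivalent repeating task (the task name and script path are placeholders; run schtasks /Create /? to confirm the switches on your Windows version):

```bat
REM Create a task that runs the refresh batch file every 3 minutes
schtasks /Create /TN "RefreshTableauExtract" /TR "C:\scripts\refresh_extract.bat" /SC MINUTE /MO 3
```

            The /SC MINUTE schedule with a /MO modifier is what lets you go below the UI's usual presets.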


            I threw together a quick batch file that logged into the server using tabcmd, then pulled a PDF from one of my workbooks. I set up a task in Task Scheduler to run that batch file. You have to set the options in Task Scheduler to repeat every X minutes. My version showed 5 as the lowest choice, but it allowed me to type over it and enter what I wanted. In my case, I ran it every minute for 5 minutes, but you would probably set it to repeat indefinitely and not so often.


            I am by no means an expert at anything Tableau or Task Scheduler, but I will help if I can. I had no prior knowledge of Task Scheduler before reading your post, but I didn't find it difficult to figure out.



            • 3. Re: How do I schedule an extract refresh to run every 3 minutes?
              Justin Smith

              Also, I would have to agree with Tracy. The option to select any time frame would be great for the Ideas forum!



              • 4. Re: How do I schedule an extract refresh to run every 3 minutes?
                Russell Christopher

                Michelle -


                May I suggest an alternative? It sounds like you're going to be wasting a lot of cycles constantly refreshing your extracts. Creating an extract is not an inexpensive proposition for either Tableau Server or the original data source - the fact that you see 15 minutes as the smallest increment available for a "repeat" is not completely arbitrary on our part.


                Generally, people do two types of analysis with Tableau:


                • Exploration, looking at trends, discovering "something is wrong"
                • Narrowing focus on that "something that is wrong" to figure out what the problem is and where it came from


                The latter category of work often requires close to leaf-level access to your data. The former doesn't.


                Why not create an aggregated extract for "category 1" work and refresh it less often - maybe hourly or daily?


                Create a second data source that connects live to SQL Server for the second category of work. Needing to spelunk around at the most granular level of the data is generally an edge-case scenario.


                Most users will do 80% of their work using the fast, aggregated extract connected to certain reports. During the "20%" scenario, users will use different reports in the same dashboard that have already been pre-filtered based on the exploration they have already done. Things will be slower in these "live" reports, no doubt...but most users won't get there very often anyway. Think of it as "Master-Detail" reporting where the details are live and the "Master" has some latency.


                This is a pretty standard design pattern for us and most people are quite successful with it.