I have looked into the tabcmd you mentioned, but I didn't get any clues on how to trigger the refresh dynamically.
Thanks for your reply.
If the extract depends on only one data flow, you can run tabcmd directly at the end of that flow in Informatica (I assume Informatica has some kind of execute-script or shell task).
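As a rough sketch of that single-flow case, the script the ETL task invokes only has to log in and kick off the refresh. The server URL, credentials, and data source name below are placeholders, not anything from the original thread; the `tabcmd login` and `tabcmd refreshextracts` subcommands themselves are real.

```python
import subprocess

def build_commands(server, user, password, datasource):
    """Build the two tabcmd invocations: a login followed by an
    extract refresh for the named published data source."""
    login = ["tabcmd", "login", "-s", server, "-u", user, "-p", password]
    refresh = ["tabcmd", "refreshextracts", "--datasource", datasource]
    return [login, refresh]

def refresh_extract(server, user, password, datasource):
    """Run the commands in order; check=True raises if tabcmd fails,
    so the Informatica task can surface the error."""
    for cmd in build_commands(server, user, password, datasource):
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Placeholder values -- substitute your own server and data source.
    refresh_extract("https://tableau.example.com", "etl_user",
                    "secret", "Sales")
```

Keeping the command construction separate from the `subprocess.run` call makes the script easy to log and dry-run before wiring it into the ETL tool.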
If the extract depends on multiple data flows/tables, you might need a table giving the state of your datasets.
You'll then need a cron job, i.e. some kind of job that runs regularly, checks this table, and triggers the command once all the datasets needed are available and up to date.
Facebook has documented their way to do that:
This is just an example. Our implementation for a similar issue is different but based on the same underlying idea.
Hope this helps.
What Damien said.
We used to call tabcmd at the end of a SQL Server job that refreshed our extracts, but I got tired of having to go through the DBAs to make changes, plus we had to upgrade tabcmd each time we upgraded Tableau Server.
So instead, I've now set up tasks that run on my Tableau Server hosts and query arbitrary databases whose data describes ETL completion times. Each task checks a single datetime value against a local text file containing the last refresh date, and if they differ, calls "tabcmd runschedule _____" where _____ is the name of an entire Schedule on Tableau Server. Things stay in sync, we do the minimum amount of extract refresh-ery, and I don't have to touch two systems to make one change.