Jupytab author here, thanks a lot for the interest in my project!
Your usage is correct; however, your dynamic2_df method does not seem to update your object. It only returns a dataframe that was created once.
You need to register the method that generates the dataframe itself: the Tableau extract will act like a "scheduler" that invokes your Python code.
```python
def generate_net_object():
    # recompute and return a dataframe built from up-to-date data
    return compute_a_new_dataframe_with_up_to_date_datas()

tables['net_object'] = jupytab.DataFrameTable('Network_Objects', refresh_method=generate_net_object)
```
Please tell me if this helps!
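To make the invocation model concrete, here is a toy stand-in (not the real jupytab internals; all names below are made up) showing how each extract re-runs the registered refresh method:

```python
class FakeDataFrameTable:
    """Toy stand-in for jupytab.DataFrameTable: stores a refresh
    callable and invokes it on every extract, like Tableau does."""
    def __init__(self, alias, refresh_method=None):
        self.alias = alias
        self.refresh_method = refresh_method
        self.data = None

    def extract(self):
        # The Tableau extract acts as the "scheduler": each extract
        # re-runs the registered refresh method.
        if self.refresh_method is not None:
            self.data = self.refresh_method()
        return self.data

calls = []

def generate_net_object():
    # recompute the table contents with up-to-date data
    calls.append("refreshed")
    return {"rows": len(calls)}

table = FakeDataFrameTable('Network_Objects', refresh_method=generate_net_object)
print(table.extract())  # first extract: {'rows': 1}
print(table.extract())  # second extract re-runs the refresh: {'rows': 2}
```

The point is that the refresh function runs on every extract, so whatever it returns is always freshly computed.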
Thank you for the explanation above. I think I understand now. Once the API call reaches the particular notebook "cell", it is up to that cell (not the rest of the notebook) to perform the refresh needed to update the dataframe. I originally thought that during the API call from Tableau (via the Jupyter Kernel Gateway), the entire notebook would execute from top to bottom.
Yes, exactly. For instance, this allows you to run heavy compute code only at the first notebook execution, then use fast refreshes for some tables via dedicated functions that update only part of the dataframe (e.g. update data for the current day only).
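As a sketch of that split (hypothetical names, plain Python instead of pandas to keep it self-contained): the expensive build runs once when the notebook starts, and the refresh method only patches the current day's entry:

```python
def compute_full_history():
    # Heavy computation: pretend this scans years of data.
    # Runs once, when the notebook is first executed.
    return {day: day * 100 for day in range(5)}

history = compute_full_history()

def fetch_current_day_value():
    # Hypothetical fast lookup for today's data only.
    return 999

def refresh_current_day():
    # Fast refresh invoked on each Tableau extract: patches the
    # latest entry without recomputing the whole history.
    history[4] = fetch_current_day_value()

refresh_current_day()
print(history[0], history[4])  # older entries untouched, latest updated
```

Registering `refresh_current_day` as the table's refresh method would then give cheap per-extract updates on top of the one-time heavy load.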