I think this problem would be best solved by adding a timestamp field with a default value of today, so that the database records when each row was inserted.
Then, you can just point the extract at the table, and if you want to analyse what tools were there on a given day, you can just filter on that timestamp.
If you're concerned about capturing when tools are removed from the toolset, you could add an endDate column and populate that with the date and time the tool was removed from the inventory.
Ultimately, this seems to be a problem that is best solved in the data source (I'm not sure what you mean by "running out of blank rows").
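To illustrate the idea, here is a minimal pandas sketch of how the suggested timestamp and endDate columns would let you reconstruct the toolset on any given day. The column names (`insertedAt`, `endDate`) and the sample data are hypothetical, just for illustration:

```python
import pandas as pd

# Hypothetical inventory table: insertedAt records when a row was added,
# endDate is empty (NaT) while the tool is still in the toolset.
tools = pd.DataFrame({
    "tool": ["Batarang", "Grapple Gun", "Smoke Pellet"],
    "insertedAt": pd.to_datetime(["2023-01-01", "2023-01-05", "2023-01-10"]),
    "endDate": pd.to_datetime([pd.Timestamp("2023-01-08"), pd.NaT, pd.NaT]),
})

def inventory_on(df, day):
    """Rows present on a given day: inserted on or before that day,
    and not yet removed (endDate missing or after that day)."""
    day = pd.Timestamp(day)
    present = (df["insertedAt"] <= day) & (df["endDate"].isna() | (df["endDate"] > day))
    return df[present]

# On 2023-01-09 the Batarang has already been removed and the
# Smoke Pellet has not been added yet.
print(inventory_on(tools, "2023-01-09")["tool"].tolist())  # → ['Grapple Gun']
```

The same filter is what Tableau would apply when you point it at the table and filter on the timestamp for a given day.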
I followed the method described on the website above, which is very similar to what you said. It worked perfectly in Tableau Desktop, but it does not work when I publish the file to Tableau Cloud (Tableau Online). Currently I don't have a database server that I can publish to on a daily basis, so I was looking for workarounds within Tableau.
OK, I think I understand what you are trying to do. I've figured out a workaround that might help.
You will need:
- Tableau Prep (to build the flow)
- Excel (with Get & Transform Data, to read the flow's output back in)
The basic flow is like this:
Tableau Prep currently cannot use the same file as both an output and an input in the same flow. We get around this by using Excel's Get Data feature to read Output.csv into a secondary Excel file, and then using that file as an input to the flow. That way, each time the flow runs, it can stack the new input data on the output data from the previous run using a Union.
The first thing to do is to set up the Output file. Add a calculated field called "Inserted" and set the calculation to NOW().
Output to a file called "Output.csv". You can see the timestamp for "Inserted" has been added as a new column.
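If it helps to see the first step outside of Tableau Prep, here is a small pandas sketch of the same idea: stamp every incoming row with the time it was processed, then write it out. The sample data is made up, and `pd.Timestamp.now()` stands in for Prep's `NOW()` calculation:

```python
import pandas as pd

# Today's snapshot (the "NewData" input). Stamp each row with the time
# the flow ran -- the equivalent of the calculated field Inserted = NOW().
new_data = pd.DataFrame({"Tool": ["Batarang", "Grapple Gun"], "Count": [12, 2]})
new_data["Inserted"] = pd.Timestamp.now()

new_data.to_csv("Output.csv", index=False)
print(list(new_data.columns))  # → ['Tool', 'Count', 'Inserted']
```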
Now create a new Excel file; I called mine "BatcaveInventoryMerged.xlsx". Use Data > Get & Transform Data > From Text/CSV to set up a connection to Output.csv, so that the output from the flow is pulled into the new Excel file. It should look something like this:
Now that you have this file, you can use it as an input to the flow so it can be merged with the new data tomorrow.
I called the connection to "BatcaveInventory.xlsx" "NewData", and the connection to "BatcaveInventoryMerged.xlsx" "PreviousOutput":
The PreviousOutput source has likely imported the Inserted field as text, so it's important to change its type to date/time before the union with NewData. At the same time, before the union, add the timestamp to the new data:
After those simple operations, you do the union and then remove the field called "Table names" (this is created by a union step, but you don't need it here). Then output to Output.csv again.
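The type conversion and union can be sketched in pandas terms as follows. This is only a stand-in for the Prep steps: the file names and sample values are hypothetical, and `pd.concat` plays the role of the Union step:

```python
import pandas as pd

# PreviousOutput comes back from CSV with Inserted as text,
# so convert it to a real datetime before stacking.
previous = pd.DataFrame({"Tool": ["Batarang"],
                         "Inserted": ["2023-01-01 08:00:00"]})
previous["Inserted"] = pd.to_datetime(previous["Inserted"])

# NewData gets its timestamp just before the union.
new_data = pd.DataFrame({"Tool": ["Grapple Gun"]})
new_data["Inserted"] = pd.Timestamp.now()

# The union: previous output plus today's rows, then write back out.
merged = pd.concat([previous, new_data], ignore_index=True)
merged.to_csv("Output.csv", index=False)
print(len(merged))  # → 2
```

Because `pd.concat` matches columns by name rather than tagging their origin, there is no "Table Names" field to remove here; in Prep you drop that field after the Union step as described above.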
On Day 2, BatcaveInventory.xlsx looks like this:
After you run the flow, Output.csv looks like this:
Then BatcaveInventoryMerged.xlsx will look like this when you refresh the query:
Then you can use BatcaveInventoryMerged.xlsx as the data source for your Tableau Workbook.
The only real problem I can see with this is that you have to refresh the query in the second Excel file to pull in the latest output from Output.csv. However, if you're handy with VBA, you could easily add a Workbook_Open() event to the workbook to remove that manual step.
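A minimal sketch of that macro, placed in the ThisWorkbook module of BatcaveInventoryMerged.xlsx (note this assumes macros are enabled and the workbook is saved as .xlsm):

```vba
Private Sub Workbook_Open()
    ' Refresh every query/connection in the workbook on open,
    ' so it always pulls the latest rows from Output.csv.
    ThisWorkbook.RefreshAll
End Sub
```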
Let me know if this works for you.
Thank you very much,
I worked on your first solution and it works. I pull data from the MS SQL Server database with a timestamp and run an incremental refresh to load it into Tableau Online. I previously faced challenges making incremental extracts on Tableau Online, but I was able to overcome them with Tableau tech support.