FYI: This workbook is no longer maintained, but the data source it relies on is. You can get it, and others like it, here.
We've instituted a quarterly "cleanup" process on our main production Tableau Server instance to remove content that is no longer being used. The goal is to get rid of anything we simply don't use, with minimal disruption to users and minimal business impact. To do this, I built a workbook that lets users sort through the content they own and see how often it's used, with a URL Action that takes them to the page where they can delete it. Based on this list, we automatically back up and then delete everything on it, unless the users do something to preserve it. It's worked pretty well for us so far!
The criteria for us are: any workbook or data source that:
- Was accessed fewer than 3 times within the last 180 days
- Was created or last updated more than 180 days ago
- Does not belong to an "Archive" project (projects we've created to specifically preserve older content)
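For reference, the three criteria boil down to a simple filter. Here's a minimal Python sketch -- the field names (`views_last_180_days`, `updated_at`, `project_name`) are my own hypothetical summary of a piece of content, not anything from the workbook itself:

```python
from datetime import datetime, timedelta

STALE_DAYS = 180
MIN_VIEWS = 3

def is_stale(content, now=None):
    """Return True if a workbook/data source meets all three stale criteria.

    `content` is assumed to be a dict with:
      views_last_180_days -- access count over the window
      updated_at          -- datetime of creation or last update
      project_name        -- name of the containing project
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=STALE_DAYS)
    return (
        content["views_last_180_days"] < MIN_VIEWS    # accessed fewer than 3 times
        and content["updated_at"] < cutoff            # untouched for 180+ days
        and "Archive" not in content["project_name"]  # not deliberately preserved
    )
```

All three conditions must hold before anything lands on the chopping block; failing any one of them keeps the content safe.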
The process goes like this:
- From the "Stale Content List" sheet, filtered to content that needs to be removed, export the CSV data. Preserve this file until the entire process is complete.
- From the CSV, grab the Owner Email data and email all of the users with stale content, informing them that they need to:
- Click the link and look at the viz
- Access the content they wish to keep until it falls off the "chopping block"--otherwise it will be deleted two weeks from now
- Send a nice reminder a few days in advance in case someone didn't read their email
- At the end of the two weeks, in Tableau Desktop, filter the "Stale Content List" sheet to just the repository URLs from the CSV you exported. This ensures we won't include content that appeared on the list after the two-week warning went out.
- Still using the filtered view, export the CSV again. This is the final list of content to remove.
- Copy the data into the StaleContentTabcmd Excel workbook. The rightmost columns will build the tabcmd commands you need via formulas. This part generally requires a little tweaking to make sure you get the columns in the right spots--make sure you don't overwrite the formula columns.
- Copy the command columns out into text files. These are designed to run in Windows Powershell.
- Make sure that the folders referenced in the commands exist.
- Test a few Archive commands to ensure they are working properly, then run them all.
- Double-check that these all ran successfully. Typically a few will fail because of weird characters and such.
- Test a few Delete commands (ideally on obviously garbage content) to ensure they are working properly.
- Are you sure your Archive commands ran successfully? Doesn't matter, check the counts and content again anyway.
- Run all the Delete commands
- Zip up all the Archived content and preserve it somewhere in case anyone needs a file back.
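The Excel step above just concatenates CSV columns into tabcmd command strings; the same thing can be sketched in Python. The column names (`Content Type`, `Name`, `Project`, `Repository URL`) and the archive folder path are assumptions -- match them to whatever your export actually contains, and spot-check the generated commands before running anything:

```python
import csv

def build_commands(csv_path, archive_dir=r"C:\TableauArchive"):
    """Build (archive, delete) tabcmd command lists from the stale-content CSV.

    Column names below are assumptions -- adjust to match your actual export.
    """
    archive_cmds, delete_cmds = [], []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            is_wb = row["Content Type"] == "Workbook"
            ext = "twbx" if is_wb else "tdsx"
            kind = "workbooks" if is_wb else "datasources"
            type_flag = "--workbook" if is_wb else "--datasource"
            # tabcmd get downloads a packaged copy for the archive
            archive_cmds.append(
                f'tabcmd get "/{kind}/{row["Repository URL"]}.{ext}" '
                f'-f "{archive_dir}\\{row["Repository URL"]}.{ext}"'
            )
            # tabcmd delete removes the content from the server
            delete_cmds.append(
                f'tabcmd delete {type_flag} "{row["Name"]}" -r "{row["Project"]}"'
            )
    return archive_cmds, delete_cmds
```

Write each list out to its own text file, exactly as in the manual process, so the Archive commands can be run and verified in PowerShell before any Delete command is touched.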
Requirements:
- You're running Tableau Desktop and Tableau Server 9.0 or higher
- You've already opened up access to Tableau Server's internal reporting database
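If you want to preview the repository side before publishing, you can run a query like the one below against the repository's `workgroup` database (e.g. `psql -h <host> -p 8060 -U readonly workgroup`). This is only a sketch of the recency criterion: the `_workbooks` view and its column names are assumptions drawn from the repository's reporting views, so verify them against your own instance first.

```python
# A first cut at the stale list straight from the repository. The view and
# column names below (_workbooks, created_at, updated_at) are assumptions
# based on the repository's reporting views -- verify against your server.
STALE_DAYS = 180

STALE_WORKBOOKS_SQL = f"""
SELECT w.name, w.created_at, w.updated_at
FROM _workbooks AS w
WHERE w.updated_at < NOW() - INTERVAL '{STALE_DAYS} days'
ORDER BY w.updated_at;
"""
```

The view-count and Archive-project criteria still need to be layered on top -- the workbook's Custom SQL handles all of that for you, so treat this only as a sanity check.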
How to publish:
- Open StaleContent.twb, attached
- When prompted to run Custom SQL, click "Yes"
- If prompted to sign in to your Tableau Server, go ahead and do so.
- When prompted for database credentials, click "Edit Connection"
- Set the server name to your Tableau Server PostgreSQL Repository host, and the password to whatever you set that password to be on your server
- This query may run a long time--for larger installations I suggest taking a data extract and refreshing it on a schedule
- Set the parameters:
- The TS Content Tableau Server Name parameter to your Tableau Server Name
- The TS Content Timezone Offset (hours) parameter to your offset from UTC
- If you only want users to see their OWN content (generally recommended!), add the field "User is Owner" to Datasource filters, keeping only "Yes". You can customize this calc if you, as an Admin, always want to see all the content.
- Publish it to Tableau Server
Caveats / warnings:
- Subscriptions are not currently taken into account. By and large this hasn't been an issue for content we've removed so far, but I'd like to factor them in, since we don't want to auto-remove content that people still use via Subscriptions. The best workaround in the meantime is to have users review the Subscriptions on all their content.
- The data being used isn't performant enough (at our scale at least) to use live, so we have to extract it. This means that users don't get immediate feedback if they're trying to preserve content and get it to fall off the "chopping block", which can be confusing.
- The URL action that opens the content page hasn't been tested on Sites
- Version 9.0 or higher only
- Use at your own risk--no warranties or guarantees, explicit or implied.