Hey Rahul, 10-15 TB?? What kind of machine is this? Can we see some specs on it?
Even if the backup only comes to 15 TB, I can imagine it taking ages, and the restore process would be just as long. Are you able to remove some unused extracts first? The server admin views will tell you which ones aren't active anymore. I'm sure there are lots more issues with having a single 15 TB file too...
Hi Vien,
The machines are 8 CPUs with 128 GB of memory each. The extracts total about 10 TB; it's not one big extract. You're right that old extracts could be removed.
It's a multi-tenant (using sites) architecture.
Hi Rahul - where is the 15 TB estimate coming from? If it's based on your current server, we should look at the extracts themselves; perhaps we can optimize them to reduce the size requirement. As you note above, removing unused extracts will help. And, in general, Tableau compression rates improve over time.
Of course, live connections eliminate the need for extracts altogether. Whether that's feasible depends on the data source, but it's another option.
If extract sizes are prohibitive, one of the most overlooked space-saving techniques I've found is to look for extracts used locally in workbooks and apply the Hide Unused Fields feature to strip out every field the workbook doesn't use. The feature is a bit hidden, but in my experience it usually cuts extract size by 80% to 90%. It's an easy win because it doesn't change the structure of the workbook in any way, and there's really no defensible reason a workbook published to Tableau Server should carry extract data it never uses.
It would be great to have an automated way of identifying workbooks in this situation and triggering alerts to the admins/users, but I don't have one yet.
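A rough starting point for that kind of automation might look like the sketch below. It assumes the third-party `tableauserverclient` package (`pip install tableauserverclient`) and uses published workbook size (`WorkbookItem.size`, reported in MB) as a crude proxy for oversized embedded extracts; the server URL, token names, and the 1 GB threshold are placeholders, not anything from this thread.

```python
# Minimal sketch, not a tested tool: flag the largest workbooks on a site
# as candidates for the Hide Unused Fields treatment discussed above.
from typing import Iterable, List, Tuple


def largest_workbooks(workbooks: Iterable, min_size_mb: int = 1024) -> List[Tuple[str, int]]:
    """Return (name, size_mb) pairs at or above the threshold, biggest first.

    Accepts any objects exposing `.name` and `.size`; tableauserverclient's
    WorkbookItem reports `.size` in megabytes.
    """
    hits = [(wb.name, wb.size) for wb in workbooks if wb.size >= min_size_mb]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)


def report_largest(server_url: str, token_name: str, token_secret: str, site_id: str = "") -> None:
    """Sign in and print the biggest workbooks on a site (requires a live server)."""
    import tableauserverclient as TSC  # assumption: package is installed

    auth = TSC.PersonalAccessTokenAuth(token_name, token_secret, site_id=site_id)
    server = TSC.Server(server_url, use_server_version=True)
    with server.auth.sign_in(auth):
        for name, size_mb in largest_workbooks(TSC.Pager(server.workbooks)):
            print(f"{name}: {size_mb} MB")
```

The filtering helper is kept separate from the server call so the size logic can be tested without a live Tableau Server; actual alerting (email, Slack, etc.) would hang off the returned list.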