Can you add more disk? Disk is cheap, and you should be able to provision a minimum of 1 TB. You most likely don't want to manually delete files from the extract folder; doing so could corrupt your system. Also, I'm not sure about the disparity in the disk stats: it could be that the 22 GB is compressed and the 70 GB is uncompressed, or it could be that you're only looking at one site. In any case, just add disk.
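Before adding disk, it's worth confirming where the space is actually going. A minimal sketch, assuming a Linux install where `/var/opt/tableau` is the data directory (adjust `DATA_DIR` for your environment; the path is an assumption, not something stated in this thread):

```shell
# DATA_DIR is an assumption: /var/opt/tableau is a common data directory
# on Linux installs of Tableau Server. Override it for your environment.
DATA_DIR="${DATA_DIR:-/var/opt/tableau}"

if [ -d "$DATA_DIR" ]; then
    # Free space on the volume holding the data directory, human-readable.
    df -h "$DATA_DIR"
    # Largest subdirectories -- useful for spotting a ballooning temp folder.
    du -sh "$DATA_DIR"/* 2>/dev/null | sort -rh | head -10
else
    # Path not present on this machine; report the root volume instead.
    df -h /
fi
```

Comparing the `du` output against what the admin views report can explain a compressed-versus-uncompressed discrepancy like the 22 GB / 70 GB one above.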
Is extending the disk volume the only applicable solution here?
After I upgraded to 2018.2.2 last week, the size of the hyper files in the temp directory has kept increasing, and it is almost 300 GB now. I'm not sure whether it will keep growing or whether there is any cut-off.
Any idea how we can resolve and control it? Even if we increase the volume today, it may reach the threshold again after a few days or months if the size keeps growing.
Can anyone help me understand what's happening?
If you saw a major spike in disk space usage after upgrading, I would advise opening a support case.
As Asgar mentioned in his original post, he ran the "cleanup command", but I'm curious whether it was run both with Tableau Server in the *stopped* state and while it was running.
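For reference, the stopped-versus-running distinction matters because some temp files are only released when the server processes are down. A hedged sketch of running the TSM cleanup both ways on 2018.2+ (verify the exact flags on your version with `tsm maintenance cleanup --help` before relying on this):

```shell
# Online pass: clears what can be cleaned while the server is up.
tsm maintenance cleanup -a

# Offline pass: stopping the server first lets cleanup remove files
# that running processes were still holding open.
tsm stop
tsm maintenance cleanup -a
tsm start
```

This is a sketch of the general procedure, not a guaranteed fix for runaway hyper temp files; if the space is reclaimed but grows back quickly, that points toward the support-case route.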
The other thing worth checking is the "Background Tasks for Non-Extracts" admin view. If the only jobs failing are, say, "Subscriptions" or "Data Alerts", it's not a major issue, but you really want to be sure that no type of "Reap" job is failing on a consistent basis.
Lastly, when folks are first starting with Tableau Server, they have not had the chance to create a lot of extracts. Over time, however, users will most likely continue to create extracts, and that will result in the need for more disk space; if you monitor the size of the nightly backup file, it will give you an idea of how things are growing.

It's easy to forget that at some point an "archive" strategy will need to be in place to archive and then delete content off of Tableau Server. That's not easy, in that part of Tableau's strength is giving users fewer restrictions so they can create the content they need. But then some amount of governance needs to come into play so that a lot of "duplicate" extracts can be avoided. Starting with 10.4 you can create "certified published data sources", but I've seen some end users really run with that freedom and create 10 GB or larger extracts. There could of course be a legitimate need for that, but if all users begin to think that 10 GB is no big deal for an extract, I could see the nightly backup file growing quite large. The "Server Disk Space" admin view can help identify really large extracts that may have outlived their usefulness.
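The backup-size monitoring suggested above can be automated with a small cron-friendly sketch. `BACKUP_DIR` and the growth-log path are assumptions; point them at wherever your nightly `.tsbak` backups actually land:

```shell
# BACKUP_DIR and GROWTH_LOG are assumptions -- adjust for your environment.
BACKUP_DIR="${BACKUP_DIR:-/var/backups/tableau}"
GROWTH_LOG="${GROWTH_LOG:-/tmp/tableau_backup_growth.log}"

# Show the most recent backups with their sizes.
ls -lh "$BACKUP_DIR"/*.tsbak 2>/dev/null | tail -5

# Append a dated total so growth over weeks is easy to eyeball.
printf '%s %s\n' "$(date +%F)" \
    "$(du -sk "$BACKUP_DIR" 2>/dev/null | awk '{print $1}')" >> "$GROWTH_LOG"
tail -3 "$GROWTH_LOG"
```

A steadily climbing log line is the early warning that the archive/governance discussion above needs to happen before the volume fills.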
Thank you for the reply, Mark. The final option would be going to Tableau Support.
Asgar Mammadli, can you please confirm whether you are still facing the disk issue due to the hyper files in the temp directory?