
    Maximum Recommended Packaged Workbook Size

    Barbara Knowles

      Hi.

       

      I work for a fairly large university and we're experiencing growing pains.  We have some users who are publishing very large packaged workbooks - upwards of 500 MB in size, and some are over a GB.  We're trying to encourage them to use published data sources, to streamline the data, to work with us to optimize the queries, and so on.  A lot of these people have refused to attend the training classes I conduct; they think they can figure it out themselves.

       

      We'd like to implement some policies for our Tableau Server.  Do any other server administrators have a policy on the maximum packaged workbook size you allow to be published?  I reached out to our Tableau rep, but he was unhelpful.

       

      I'd appreciate any advice you might have.

       

      Thanks.

      Barbara

        • 1. Re: Maximum Recommended Packaged Workbook Size
          Jeff Strauss

          Hi Barbara.  First off, our company doesn't have a limit on workbook size, but we do always encourage best practices via Designing Efficient Workbooks | Tableau Software and other whitepapers and internal communications.  That being said, we do have about 9 workbooks (I just looked at one of our sites) that exceed 1 GB in size.  In fact, one of them gets up to about 6 GB, and they all seem to render fine.

           

          In terms of important limiting factors, I can think of two.

           

          1. Do you have sufficient disk space?  In our case we have more than enough (2TB).

           

          2. Is your disk architecture fast enough to render these large workbooks?  I forget what the recommendation is (it may be 300 MB per second), but you should be able to monitor this with either TabMon or perfmon.
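          If you want a quick spot check of raw disk throughput without setting up TabMon, the built-in Windows typeperf tool can sample the same perfmon counters from the command line.  This is just a rough sketch - the counter, interval, and sample count are only one reasonable choice:

          REM Sample total disk throughput every 5 seconds, 12 times (about a minute)
          typeperf "\PhysicalDisk(_Total)\Disk Bytes/sec" -si 5 -sc 12

          REM Or log reads and writes separately to a CSV for later review
          typeperf "\PhysicalDisk(_Total)\Disk Read Bytes/sec" "\PhysicalDisk(_Total)\Disk Write Bytes/sec" -si 5 -sc 60 -f CSV -o disk_throughput.csv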

           

           

          And there are ways to rein in the outliers; for example, within the site admin settings there is a maximum storage quota that can be set.
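          If you'd rather script that than click through the site settings page, I believe tabcmd's editsite command has a --storage-quota option (in MB) that does the same thing - double-check it against your server version.  The site name and quota below are just placeholders:

          REM Illustrative only: cap the "Research" site at 5 GB of storage (value is in MB)
          tabcmd login -s https://tableau.example.edu -u admin_user
          tabcmd editsite "Research" --storage-quota 5120
          tabcmd logout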

           

          • 2. Re: Maximum Recommended Packaged Workbook Size
            Toby Erkson

            Can you please send me the list of students who think they're above training (at a university, no less!), as my colleagues and I want to make sure we don't waste our time hiring such uncooperative, selfish, non-team players.  Having dealt with similar individuals who refuse to read constructive emails or documentation, or even take advice until they're in their own dire situation, I've lost my desire to go out of my way for them.  Feel free to let them know this from a 20+ year BI veteran who's worked everywhere from start-ups to top Fortune 500 companies.

             

            As to your question:

            We have 17 objects over 1 GB, but it's not too big of a deal because we have plenty of hard drive space.  Having enough backgrounders and RAM to accommodate the refreshes helps, too.  Large sizes don't bother me...at least for now...as long as they are actually being used.  That's the nice thing about Tableau.
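            If you do need more backgrounders on a given node, the process count can be bumped with tabadmin - the worker0.backgrounder.procs key is what I remember from the pre-TSM config, so verify it against your version's documentation first:

            REM Illustrative only: run 4 backgrounder processes on the worker0 node
            tabadmin stop
            tabadmin set worker0.backgrounder.procs 4
            tabadmin config
            tabadmin start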

             

            Questions to ponder:

            Are extracts really necessary?  Can live connections be used instead?  Can you implement a required training process that people must complete before being granted the Publisher site role?  Can you create your own custom admin reports to monitor size (Workgroups Database)?  Workbooks that fall outside the maximum size can then trigger a sit-down with the author to make sure procedure is being followed and corrections are made if it isn't...if they wish to have their work remain on the Tableau Server.
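            For the custom size report, a quick-and-dirty option is to query the repository directly once the readonly user is enabled (tabadmin dbpass --username readonly <password>).  The snippet below is only a sketch - the port, database, and column names are what I recall from the workgroup schema, so verify them against the data dictionary for your version:

            REM Hypothetical sketch: list the 20 largest workbooks (size is stored in bytes)
            psql -h your-tableau-host -p 8060 -U readonly -d workgroup -c "SELECT name, size/1048576 AS size_mb FROM workbooks ORDER BY size DESC LIMIT 20;"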

            • 3. Re: Maximum Recommended Packaged Workbook Size
              Toby Erkson

              Along the lines of Jeff's recommendation, you can limit extract refresh times.  Here's a snippet from the batch file I run every time I upgrade:

              .
              .
              .
              REM Longest allowable time for completing an extract refresh, in seconds (10800 seconds = 3 hours)
              tabadmin set backgrounder.querylimit 10800
              
              REM Seconds beyond the query limit before a task is canceled, in seconds (300 seconds = 5 minutes)
              tabadmin set backgrounder.extra_timeout_in_seconds 300
              
              REM Tasks to be canceled if they run longer than they should
              tabadmin set backgrounder.timeout_tasks refresh_extracts,increment_extracts,subscription_notify,single_subscription_notify
              .
              .
              .
              

               

              tabadmin set options

              • 4. Re: Maximum Recommended Packaged Workbook Size
                Saunie Burke

                Hello Barbara,

                 

                After investigating this, it looks like the ability to limit the file size of a workbook being published to Tableau Server is not currently built into the product.

                 

                The best way to provide suggestions for product enhancements is to submit an idea through our community: http://community.tableau.com/community/ideas

                 

                Providing feedback through the community forums allows other users to vote and discuss enhancement requests, and allows our development team to gauge the demand for each enhancement. Your feedback is invaluable and helps us improve our software.

                 

                I did find the archived idea below; however, because it has been archived, I would suggest creating a new one so other community members can up-vote it for our Dev team.

                 

                https://community.tableau.com/ideas/3482

                • 5. Re: Maximum Recommended Packaged Workbook Size
                  mohammad.walid.0

                  I faced a similar situation and realized that most of our users point to the same data source in their workbooks.

                  Because everybody extracted that same data source and published it individually, the storage was being consumed by the same data source multiple times....

                  That was my AHA moment.

                   

                  I took the data source in question, extracted it, and published it myself (as admin), so it now occupies the space just once.

                  Then I told our users to point all their dashboards at my Gold Standard Data Source.
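                  If it helps, the publish step can be scripted too.  Something like the tabcmd call below should work, though the file, project, and data source names here are only placeholders:

                  REM Illustrative only: publish one shared extract as the single "gold standard" data source
                  tabcmd login -s https://tableau.example.edu -u admin_user
                  tabcmd publish "EnrollmentData.tdsx" --name "Gold Standard Data Source" --project "Shared Data Sources" --overwrite
                  tabcmd logout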

                   

                  And that's how I resolved it at my end.

                   

                  It might work for you as well.