3 Replies Latest reply on Jan 31, 2013 10:42 AM by frank.bracco

    Tableau Data Engine Error: 4: Not enough storage is available to process this command

    frank.bracco

      Hi all, any thoughts on this? When I create an extract, I set its location to the desktop (the drive has over 90 GB available). But as the query executes and the extract is created, I get this error:

       

      Tableau Data Engine Error: 4: Not enough storage is available to process this command.

      - (MapViewOfFileEx: offset=0 mapping_size=2621440)

      - Path: C:\Users\e138656\AppData\Local\Temp\tmp3B1C.tmp\Extract\Extract\PAY_BLK.data Limit: 2621440 PageLimit: 2621440

      Unable to create extract

       

      I'm guessing it has to do with page file or memory limits; sometimes the page limit and data limit show higher or lower amounts. I could really use some thoughts on this, because our SQL Server just can't handle the processing.

       

      There is a similar question regarding the default extract location (http://community.tableau.com/thread/110809), but it seems rather old, so I thought it best to start a new thread.

        • 1. Re: Tableau Data Engine Error: 4: Not enough storage is available to process this command
          Joshua Milligan

          Frank,

           

          I remember experiencing this problem a while ago.  In our case it was because we were using a 32-bit version of Windows and the limit, imposed by the OS, was somewhere around 2GB.  When we switched to using a 64-bit OS, the error disappeared.

           

          A quick Google search reveals that this is a problem experienced by other applications, not just Tableau (see http://stackoverflow.com/questions/4314493/file-size-limit-for-sqlite-on-32bit-system).
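
          Since the culprit is the 32-bit address space rather than free disk space, one quick sanity check is to confirm the bitness of the process you're running. This is just an illustrative Python sketch (not anything Tableau-specific):

          ```python
          import struct
          import sys

          # Pointer size in bits: 32 on a 32-bit process, 64 on a 64-bit one.
          # A 32-bit process on Windows has roughly a 2 GB usable address
          # space, which is the wall a large memory-mapped extract hits.
          bits = struct.calcsize("P") * 8
          print(f"This process is {bits}-bit")
          print(f"Max addressable int: {sys.maxsize}")
          ```

          On a 64-bit OS with 64-bit Tableau, the mapping-size limit in the error above should no longer apply.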

           

          Joshua

          • 2. Re: Tableau Data Engine Error: 4: Not enough storage is available to process this command
            Joshua Milligan

            Frank,

             

            There are a few things that might help if the OS is not the issue or you can't upgrade to 64-bit. If your extracts really are that large (2 GB+), you might look at ways of reducing their size. Some possibilities:

             

            1. Hide Unused Fields.  Using this option on the extract dialog can greatly reduce the size because any field that is hidden will not be included in the extract.

             

            2. Aggregate Data for Visible Dimensions.  This option may work for you if all of your views are aggregating to a higher level of detail than the underlying data.  It can drastically reduce extract size.

             

            3. Can You Work with a Subset of the Data?  Maybe you have 15 years of history, but you really only need to do detailed analysis of the last 3 or 5 years.  Could you have one data connection that gives you a very high-level aggregation of the entire history and another that pulls detail for the last 3 years?

             

            4. Can You Connect Live?  Tableau works very well connecting live to all kinds of databases.  A poorly structured or improperly indexed/partitioned database might do better with an extract -- but consider whether connecting live is an option in your case.
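
            To make items 2 and 3 concrete, here is a small hypothetical sketch in Python (the rows and column names are made up for illustration) showing how aggregating to the dimensions your views actually use, and keeping detail only for recent years, both shrink the row count:

            ```python
            from collections import defaultdict

            # Hypothetical row-level data: (year, region, sales). A real
            # extract might hold millions of such rows.
            rows = [
                (2010, "East", 100.0), (2010, "East", 250.0),
                (2010, "West", 75.0),  (2011, "East", 300.0),
                (2011, "West", 50.0),  (2011, "West", 125.0),
            ]

            # Item 2: aggregate to the level of detail the views use --
            # one row per (year, region) instead of one row per transaction.
            totals = defaultdict(float)
            for year, region, sales in rows:
                totals[(year, region)] += sales

            # Item 3: keep full detail only for recent years.
            recent_detail = [r for r in rows if r[0] >= 2011]

            print(len(rows), "->", len(totals), "aggregated rows")
            print(len(recent_detail), "detail rows for recent years")
            ```

            The same idea applies inside Tableau's extract dialog: aggregation and filtering happen before the extract is written, so the 2 GB ceiling is never reached.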

             

             

            Joshua

            1 of 1 people found this helpful
            • 3. Re: Tableau Data Engine Error: 4: Not enough storage is available to process this command
              frank.bracco

              Hi Josh,

               

              We can connect live, but the database server just doesn't perform well. I've been hiding the unused fields - that's a great suggestion for anyone else experiencing this problem.

               

              The data is actually structured extremely well as a snowflake/star schema for data warehousing purposes, and the indexes on the keys have been built out as well. Right now we only have one snapshot loaded in this database. The problem we're running into is that this [dev] server doesn't have enough RAM (only 4 GB) to perform in a timely manner. In truth, we're only talking about just under 3 million rows (but a lot of dimension columns), yet running directly against the server is a nightmare with Tableau. Some of the queries can take 10-20+ minutes to execute, and then Tableau still has to render.

               

              I'll try to find some folks with a 64-bit machine and get some help on that front; I appreciate the suggestion.