5 Replies Latest reply on Nov 22, 2010 5:42 PM by Robert Morton

    Errors creating extract in 6.0

    Otoold

      I've been experiencing performance issues since upgrading Server and Desktop to 6.0. We typically connect directly to a SQL Server database for creating new workbooks and publishing. Is a live connection like this supposed to see the 100x performance improvements too?


      I have since tried creating a data extract to see if that helps. I am receiving the following errors. Has anyone seen these before?


      SQL Server database error 0x80004005: protocol error in TDS stream

      SQL Server database error 0x80004005: Communication link failure

      SQL Server database error 0x80004005: TCP Provider: An existing connection was forcibly closed by the remote host

      Unable to create extract

        • 1. Re: Errors creating extract in 6.0
          Robert Morton

          Hi otoold,


          Some web searches turn up the following Microsoft Knowledge Base article:  http://support.microsoft.com/kb/945977


          Can you check your NIC settings and see if the recommended changes fix your problem?




          • 2. Re: Errors creating extract in 6.0
            laurinl

            I have also run into an issue when attempting to create an extract in 6.0, after experiencing performance issues with queries in 6.0. It looks like the extract creation is erroring out after about 40 million records.


            I attended a 6.0 Preview event, and I thought the database used in the demonstration was much larger than this, so I am a little confused about why I am getting the error message below. Has anyone seen this one before?


            Tableau Data Engine Error: 4: Not enough quota is available to process this command.

            - (CreateFileMapping: offset=284098560 limit=65536)

            Unable to create extract

            • 3. Re: Errors creating extract in 6.0
              Robert Morton

              Hi laurinl,


              Do you have sufficient space on your disk drive that holds your Temp directory?  This is typically on your C: drive.  For 40+ Million records -- depending on the number of columns -- this could easily take several tens of GB of temporary space.  This is especially true if any of your columns are large VARCHAR columns, which many databases use to represent large documents such as images, XML, etc. stored directly in the database.  You may wish to hide the columns you aren't interested in so that the extract process only pulls down data that you need.
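              To make the "several tens of GB" figure concrete, here is a back-of-envelope sketch of how temporary space scales with row count and row width. The 500-bytes-per-row figure is an illustrative assumption (wide VARCHAR columns can easily reach this), not a measurement from any particular schema:

              ```python
              # Rough temp-space estimate for an extract: rows * average row width.
              # avg_bytes_per_row is an assumed figure -- measure your own schema
              # (sum of column widths) for a real estimate.
              def estimate_temp_space_gb(rows, avg_bytes_per_row):
                  """Return an order-of-magnitude temporary-space estimate in GB."""
                  return rows * avg_bytes_per_row / 1024**3

              # 40 million rows at an assumed ~500 bytes/row:
              print(round(estimate_temp_space_gb(40_000_000, 500), 1))  # 18.6
              ```

              Hiding unused columns shrinks `avg_bytes_per_row` directly, which is why it is the first thing worth trying.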


              Additionally you mentioned problems with query performance in 6.0 against your database.  Did you experience better performance in a prior version of Tableau?  What kind of database are you using, and how are you connecting (i.e., single table, multi-table or custom SQL)?


              Does this help?


              • 4. Re: Errors creating extract in 6.0
              laurinl

                Thanks for the quick response Robert!


                I have over 200 GB of free space on my C: drive. I already hid the columns we do not need (and only pulled the tables we do need for our analysis).


                I have a multi-table connection to an Oracle 9 database. I have not used this connection from a prior version. My comment about query performance refers to how Tableau performs when building a visualization against the live database.


                I did have one table with a LONG data type. I just removed this table from the extraction and am attempting it again. Your comment on the VARCHAR data types got me thinking about this option. It seems like a bit of a reach, so any other thoughts are welcome.


                Thanks again!

                • 5. Re: Errors creating extract in 6.0
                  Robert Morton

                  Hi laurinl,


                  Another thing you may wish to consider is establishing filter criteria when creating your extract.  The extract dialog gives you options for filtering and aggregation, and if you use a restrictive filter (e.g. for recent events only), you can get an idea of the extract's size requirements in proportion to your complete volume of data.  Given that information, is 200 GB of free space still enough for the amount of data you are extracting?
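                  The extrapolation step can be sketched as simple arithmetic: extract a filtered sample, measure it, and scale by the row count. The sample figures below (1 million rows producing a 0.5 GB extract) are hypothetical placeholders:

                  ```python
                  # Scale a filtered sample extract's size up to the full row count.
                  # Assumes size grows roughly linearly with rows, which is only a
                  # first approximation (compression can change the ratio).
                  def extrapolate_extract_gb(sample_rows, sample_gb, total_rows):
                      """Estimate the full extract size in GB from a filtered sample."""
                      return sample_gb * total_rows / sample_rows

                  # Hypothetical: a 1-million-row sample produced a 0.5 GB extract,
                  # and the full table holds 40 million rows.
                  print(extrapolate_extract_gb(1_000_000, 0.5, 40_000_000))  # 20.0
                  ```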


                  If you are still unable to complete the extract after trying other options, I encourage you to contact our very helpful support staff so they can work with you to collect logs and understand the underlying problem.