3 Replies Latest reply on Feb 24, 2012 2:44 PM by Robert Morton

    repeated out of memory errors with ODBC connection to large service provider database



      I am connected to a large third-party data provider (1010data) for a brief trial, in hopes of analyzing their FNMA agency mortgage database using Tableau.  I connect via ODBC/System DSN with SAN.  Here are the connection details.


      DSN: www2.1010data.com

      Host: https://www2.1010data.com (although I specify my proxy/port before this string)

      User ID: capgroup_tableau_1

      Password: cap217

      Port: 443


      SAN Pool ID Group Name: capgroup_tableau


      When opening the .twb, hit the X in the top right corner when prompted to create an extract.  This is a 100MM+ record database that takes up 20 GB of space.


      The workbook as attached loads.  But if I try to drag a field such as PCCOUP into the view, or create a group off of it, I get an out-of-memory error message saying "Unable to properly calculate the domain for the field 'pccoup'. Displayed data may be incorrect."  The same happens with any additional manipulation of the data I attempt.  The dataset is already limited to only a single month's worth of data (it goes back nearly 20 years).


      Is there anything I can do about this, or is Tableau not capable of working with this dataset given my local computing power?  I am running Tableau 7.0 on XP with a new Intel Core i5-2500 CPU @ 3.3 GHz and 3.16 GB of RAM, and a 200 GB drive with 190 GB free.  Is there an ODBC configuration change that would help?

        • 1. Re: repeated out of memory errors with ODBC connection to large service provider database
          Robert Morton

          Hi Dbetanzos,


          This sounds like a problem with the ODBC driver misreporting how much memory is required for the records it intends to return. There are ways to address problems like this which I can guide you through. First, however, I would like you to contact Support so they can work with you to collect log files, which will be necessary to determine the root cause of this problem. Please mention that Robert Morton is interested in taking a look at the logs, and I hope to get back to you soon.


          In the meantime I'll try to reproduce the problem locally with the information you have provided.



          • 3. Re: repeated out of memory errors with ODBC connection to large service provider database
            Robert Morton



            Please remove these attachments from the forum -- it's best to only share those files directly with Support.


            From a cursory inspection it appears that even your aggregate queries are returning nearly 10 million records, which is very suspicious. Consider examining your ODBC configuration for 1010data to determine why the aggregate queries are yielding such fine-grained results.


            Lastly, I noticed that certain fields like 'pccoup' appear as floating-point values, but you have elected to treat them as discrete dimensions. Since floating-point fields often contain a large number of unique values, attempting to use them as a dimension can lead to very large result sets. This may explain why your aggregate queries are failing to group results into coarse-grained buckets. If these fields are not numeric in nature (even if they are reported as such by the driver), you may need to create calculated fields that cast them to INTEGER or STRING, and use only the calculated fields in your visualizations.
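
            As a sketch of what such a calculated field might look like (assuming 'pccoup' holds a coupon rate like 4.5, and that rounding to two decimal places is acceptable for your analysis), something along these lines would give Tableau a low-cardinality dimension to group on:

            // Hypothetical calculated field "PCCOUP (String)":
            // round the floating-point coupon, then cast to a string so
            // Tableau treats it as a discrete dimension with few members
            STR(ROUND([pccoup], 2))

            If the values are truly integral codes rather than rates, INT([pccoup]) would be the simpler cast. Either way, use the calculated field in the view in place of the raw floating-point field.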