5 Replies Latest reply on Sep 11, 2017 11:15 PM by Hristo Mitev

    Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)

    Hristo Mitev

      Hello,

       

One of our extracts started exhausting server RAM after we applied some ODBC customisations to the TDS so that it ignores the SUM queries that are run on the extract before data processing.

Unfortunately, after we applied that change the extract started exhausting the server resources when refreshing, despite us reducing the amount of data in the extract.

So at the moment we assume this is because of the ODBC customisations we made to remove the SUM queries. Unfortunately, we are not exactly sure which of the settings caused it (if it is related to that at all).

       

      This is the error that gets triggered on the server side:

      2017-09-07 12:39:53.072 +0100 (,,,) scheduled-background-job-runner-1 : ERROR com.tableausoftware.backgrounder.runner.BackgroundJobRunner - Error executing backgroundjob: :refresh_extracts
      java.util.concurrent.ExecutionException: com.tableausoftware.nativeapi.dll.DataSourceException: The connection to the data source might have been lost.
      ExternalProtocol::PipeMessageSource::ReadBytes: Communication with the Tableau Protocol Server process was lost.
      

       

And these are the ODBC customisations that we use:

      <connection-customization class='genericodbc' enabled='true' version='10.1'>
            <vendor name='MySQL ODBC 5.3' />
            <driver name='MySQL ODBC 5.3 Driver' />
            <customizations>
              <customization name='CAP_CREATE_TEMP_TABLES' value='yes' />
              <customization name='CAP_ISOLATION_LEVEL_READ_COMMITTED' value='no' />
              <customization name='CAP_ISOLATION_LEVEL_READ_UNCOMMITTED' value='no' />
              <customization name='CAP_ISOLATION_LEVEL_REPEATABLE_READS' value='no' />
              <customization name='CAP_ISOLATION_LEVEL_SERIALIZABLE' value='no' />
              <customization name='CAP_ODBC_BIND_DETECT_ALIAS_CASE_FOLDING' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_DATETIME_AS_CHAR' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_DATE_AS_CHAR' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_MAX_STRING_BUFFERS' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_MEDIUM_STRING_BUFFERS' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_SIGNED' value='no' />
              <customization name='CAP_ODBC_BIND_FORCE_SMALL_STRING_BUFFERS' value='no' />
              <customization name='CAP_ODBC_BIND_SUPPRESS_INT64' value='no' />
              <customization name='CAP_ODBC_BIND_SUPPRESS_PREFERRED_TYPES' value='no' />
              <customization name='CAP_ODBC_BIND_SUPPRESS_WIDE_CHAR' value='no' />
              <customization name='CAP_ODBC_CURSOR_DYNAMIC' value='no' />
              <customization name='CAP_ODBC_CURSOR_FORWARD_ONLY' value='no' />
              <customization name='CAP_ODBC_CURSOR_KEYSET_DRIVEN' value='no' />
              <customization name='CAP_ODBC_CURSOR_STATIC' value='no' />
              <customization name='CAP_ODBC_ERROR_IGNORE_FALSE_ALARM' value='no' />
              <customization name='CAP_ODBC_FETCH_BUFFERS_RESIZABLE' value='no' />
              <customization name='CAP_ODBC_FETCH_BUFFERS_SIZE_MASSIVE' value='no' />
              <customization name='CAP_ODBC_FETCH_CONTINUE_ON_ERROR' value='no' />
              <customization name='CAP_ODBC_FETCH_ESTIMATE_ROW_COUNT' value='no' />
              <customization name='CAP_ODBC_METADATA_STRING_LENGTH_UNKNOWN' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_EXECUTED_QUERY' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_PREPARED_QUERY' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_SELECT_STAR' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_SQLCOLUMNS_API' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_SQLFOREIGNKEYS_API' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_SQLPRIMARYKEYS_API' value='no' />
              <customization name='CAP_ODBC_METADATA_SUPPRESS_SQLSTATISTICS_API' value='no' />
              <customization name='CAP_ODBC_REBIND_SKIP_UNBIND' value='no' />
              <customization name='CAP_ODBC_TRIM_VARCHAR_PADDING' value='no' />
              <customization name='CAP_ODBC_UNBIND_AUTO' value='no' />
              <customization name='CAP_ODBC_UNBIND_BATCH' value='no' />
              <customization name='CAP_ODBC_UNBIND_EACH' value='no' />
              <customization name='CAP_QUERY_BOOLEXPR_TO_INTEXPR' value='yes' />
              <customization name='CAP_QUERY_FROM_REQUIRES_ALIAS' value='no' />
              <customization name='CAP_QUERY_GROUP_BY_ALIAS' value='no' />
              <customization name='CAP_QUERY_GROUP_BY_DEGREE' value='no' />
              <customization name='CAP_QUERY_HAVING_REQUIRES_GROUP_BY' value='no' />
              <customization name='CAP_QUERY_HAVING_UNSUPPORTED' value='no' />
              <customization name='CAP_QUERY_JOIN_ACROSS_SCHEMAS' value='no' />
              <customization name='CAP_QUERY_JOIN_REQUIRES_SCOPE' value='no' />
              <customization name='CAP_QUERY_NULL_REQUIRES_CAST' value='no' />
              <customization name='CAP_QUERY_SELECT_ALIASES_SORTED' value='yes' />
              <customization name='CAP_QUERY_SORT_BY_DEGREE' value='yes' />
              <customization name='CAP_QUERY_SUBQUERIES' value='no' />
              <customization name='CAP_QUERY_SUBQUERIES_WITH_TOP' value='yes' />
              <customization name='CAP_QUERY_SUBQUERY_QUERY_CONTEXT' value='no' />
              <customization name='CAP_QUERY_TOPSTYLE_LIMIT' value='no' />
              <customization name='CAP_QUERY_TOPSTYLE_ROWNUM' value='no' />
              <customization name='CAP_QUERY_TOPSTYLE_TOP' value='no' />
              <customization name='CAP_QUERY_TOP_0_METADATA' value='no' />
              <customization name='CAP_QUERY_TOP_N' value='yes' />
              <customization name='CAP_QUERY_WHERE_FALSE_METADATA' value='no' />
              <customization name='CAP_SELECT_INTO' value='yes' />
              <customization name='CAP_SELECT_TOP_INTO' value='yes' />
              <customization name='CAP_SET_ISOLATION_LEVEL_VIA_ODBC_API' value='no' />
              <customization name='CAP_SET_ISOLATION_LEVEL_VIA_SQL' value='no' />
              <customization name='CAP_SUPPRESS_CONNECTION_POOLING' value='no' />
              <customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='no' />
              <customization name='CAP_SKIP_CONNECT_VALIDATION' value='no' />
              <customization name='SQL_AGGREGATE_FUNCTIONS' value='127' />
              <customization name='SQL_CATALOG_NAME_SEPARATOR' value='.' />
              <customization name='SQL_CATALOG_TERM' value='database' />
              <customization name='SQL_CATALOG_USAGE' value='29' />
              <customization name='SQL_COLUMN_ALIAS' value='Y' />
              <customization name='SQL_CONVERT_FUNCTIONS' value='0' />
              <customization name='SQL_CURSOR_COMMIT_BEHAVIOR' value='2' />
              <customization name='SQL_DATETIME_LITERALS' value='7' />
              <customization name='SQL_DBMS_NAME' value='MySQL' />
              <customization name='SQL_DBMS_VER' value='5.6.30-76.3-56-log' />
              <customization name='SQL_DRIVER_ODBC_VER' value='03.51' />
              <customization name='SQL_DRIVER_VER' value='03.51.30' />
              <customization name='SQL_IDENTIFIER_QUOTE_CHAR' value='`' />
              <customization name='SQL_MAX_IDENTIFIER_LEN' value='192' />
              <customization name='SQL_NUMERIC_FUNCTIONS' value='16777215' />
              <customization name='SQL_ODBC_INTERFACE_CONFORMANCE' value='2' />
              <customization name='SQL_ODBC_VER' value='03.80.0000' />
              <customization name='SQL_OJ_CAPABILITIES' value='123' />
              <customization name='SQL_QUOTED_IDENTIFIER_CASE' value='3' />
              <customization name='SQL_SCHEMA_TERM' value='' />
              <customization name='SQL_SCHEMA_USAGE' value='0' />
              <customization name='SQL_SPECIAL_CHARACTERS' value='ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜáíóúñÑ' />
              <customization name='SQL_SQL92_DATETIME_FUNCTIONS' value='7' />
              <customization name='SQL_SQL92_NUMERIC_VALUE_FUNCTIONS' value='0' />
              <customization name='SQL_SQL92_PREDICATES' value='0' />
              <customization name='SQL_SQL92_RELATIONAL_JOIN_OPERATORS' value='466' />
              <customization name='SQL_SQL92_STRING_FUNCTIONS' value='255' />
              <customization name='SQL_SQL92_VALUE_EXPRESSIONS' value='0' />
              <customization name='SQL_SQL_CONFORMANCE' value='4' />
              <customization name='SQL_STRING_FUNCTIONS' value='491519' />
              <customization name='SQL_SYSTEM_FUNCTIONS' value='7' />
              <customization name='SQL_TABLE_TERM' value='table' />
              <customization name='SQL_TIMEDATE_ADD_INTERVALS' value='0' />
              <customization name='SQL_TIMEDATE_DIFF_INTERVALS' value='0' />
              <customization name='SQL_TIMEDATE_FUNCTIONS' value='106495' />
              <customization name='SQL_TXN_CAPABLE' value='3' />
            </customizations>
          </connection-customization>
      

       

       

      So my questions are:

Did anyone else experience similar issues?

Does anyone have thoughts on the potential root cause of the issue, as it may not be what we suspect?

        • 1. Re: Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)
          Jeff Strauss

I haven't seen similar issues; typically our backgrounders take up a lot of CPU time but not that much RAM.  Here are a few suggestions:

           

1. Can you remove all customizations and see if the problem still occurs? Then gradually add the customizations back in (see the minimal starting point sketched below).

           

2. Based on your error "Communication with the Tableau Protocol Server process was lost.", have a look at the log called tabprotosrv, which covers the primary communication between Tableau and your database.
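
 

          For step 1, a minimal starting point could be your original wrapper with an empty customizations list, re-adding one capability line per refresh test (everything here is taken from the block you posted; nothing new is added):

          <connection-customization class='genericodbc' enabled='true' version='10.1'>
                <vendor name='MySQL ODBC 5.3' />
                <driver name='MySQL ODBC 5.3 Driver' />
                <customizations>
                  <!-- intentionally empty: re-add one <customization .../> line from your full block per test run -->
                </customizations>
              </connection-customization>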

          • 2. Re: Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)
            Hristo Mitev

            Thank you for the suggestion. I did find something else interesting in the logs:

             

            {"ts":"2017-09-07T12:30:20.419","pid":16348,"tid":"3740","sev":"info","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"Resource Manager: Memory info: 3769966592 bytes (current process); 13604311040 bytes (Tableau total)"}
            {"ts":"2017-09-07T12:39:32.760","pid":16348,"tid":"3740","sev":"info","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"Resource Manager: Exceeded allowed memory usage across all processes. 17193902080 bytes"}
            {"ts":"2017-09-07T12:39:32.761","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"Resource Manager: Process is above Memory Threshold. Requesting termination."}
            {"ts":"2017-09-07T12:39:32.767","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"\n\n*** Terminate handler called.\n"}
            {"ts":"2017-09-07T12:39:34.292","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"\n\n*** Unhandled Exception: 0xe0000000\n"}
            {"ts":"2017-09-07T12:39:34.416","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"\n*** In DoMiniDumpWriteDump: Found function 'MiniDumpWriteDump'.\n"}
            {"ts":"2017-09-07T12:39:34.416","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"\n*** In DoMiniDumpWriteDump: Found mini-dump thread handle.\n"}
            {"ts":"2017-09-07T12:39:34.416","pid":16348,"tid":"3740","sev":"fatal","req":"-","sess":"-","site":"-","user":"-","k":"msg","v":"\n*** In DoMiniDumpWriteDump: Signaling mini dump thread to create dump.\n"}
            

             

It seems the process is indeed running out of memory. The server has 8 GB of RAM, so it seems it's using some swap, as I can see a usage of around 17 GB being reported when the limit was exceeded.
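
            (Converting the logged figures to decimal gigabytes: 3,769,966,592 bytes ≈ 3.8 GB for the current process, 13,604,311,040 bytes ≈ 13.6 GB across Tableau as a whole, and 17,193,902,080 bytes ≈ 17.2 GB at the point of termination, i.e. roughly double the 8 GB of physical RAM.)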

I'll have to check whether I can increase that threshold, if possible without increasing the actual RAM (as it's quite difficult to do that in our current infrastructure).

Interestingly enough, this extract previously didn't reach that limit, so I'll also try changing some of the customisation options that may be responsible for this.

             

            In the meantime if you have any other ideas please do share.

            • 3. Re: Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)
              Toby Erkson

As a general rule of thumb you should have 8 GB of RAM per core (CPU).  Also, running a server on 8 GB is ridiculously low in the first place!  Make sure one of your "to do" tasks is to get the necessary RAM installed ASAP.  You should also make sure you have a dedicated server for your Tableau Server environment; do not allow it to be shared with other applications.
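
              For instance, under that rule of thumb a 4-core node would call for 32 GB and an 8-core node for 64 GB, so 8 GB total is well below the recommended floor for any core count.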

              • 4. Re: Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)
                Toby Erkson

For the customizations, only include the ones that you are actually changing (see the trimmed example below).

                 

Also, make sure that the version attribute matches the current version of your Tableau Server:

                <connection-customization class='genericodbc' enabled='true' version='10.1'>
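
                 For example, a trimmed block might look like the following (the three capability lines are copied from your posted block purely for illustration; keep only whichever ones your testing shows you actually need, and set version to whatever your Server is running):

                 <connection-customization class='genericodbc' enabled='true' version='10.1'>
                       <vendor name='MySQL ODBC 5.3' />
                       <driver name='MySQL ODBC 5.3 Driver' />
                       <customizations>
                         <customization name='CAP_CREATE_TEMP_TABLES' value='yes' />
                         <customization name='CAP_QUERY_SUBQUERIES' value='no' />
                         <customization name='CAP_SELECT_INTO' value='yes' />
                       </customizations>
                     </connection-customization>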

                 

                • 5. Re: Tableau Extract started exhausting server resources on refresh (assumption: after ODBC customisation changes)
                  Hristo Mitev

I completely agree with your statement; unfortunately, the current architecture that we use is pretty limited and we have only just started fully adopting Tableau, so we are still at a pretty early adoption stage.

Yes, the server is dedicated to Tableau only.

                   

                  Thank you for your suggestions!