8 Replies Latest reply on Sep 24, 2018 6:24 AM by Toby Erkson

    Configuring the underlying Postgres database

    Nobusu Hide

      I'm trying to set some configuration values in the underlying PostgreSQL database, but it appears that Tableau overwrites them every time I restart the server using `tsm`.


       Is there a way to make the changes I make to the postgresql.conf file persist across Tableau Server restarts? The value I'm interested in changing is the `shared_buffers` setting, which is set to 16MB, whereas the general advice from the PostgreSQL community is to use 25% of system memory as a reasonable starting point.
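
       For reference, here's roughly what I'm looking at, plus the workaround I plan to try next -- just a sketch; it assumes superuser access to the internal "workgroup" database (which I'm not sure Tableau grants or supports) and that the restart only regenerates postgresql.conf, not postgresql.auto.conf:

       ```sql
       -- Check the current value and where it was set from
       SHOW shared_buffers;
       SELECT name, setting, unit, source, sourcefile
       FROM pg_settings
       WHERE name = 'shared_buffers';

       -- ALTER SYSTEM writes to postgresql.auto.conf, which PostgreSQL reads
       -- after postgresql.conf, so a value set here should override whatever
       -- Tableau regenerates -- assuming Tableau doesn't also rewrite
       -- postgresql.auto.conf and that the connecting role is a superuser
       -- (neither of which is guaranteed).
       ALTER SYSTEM SET shared_buffers = '2GB';
       -- shared_buffers only takes effect after the database is restarted.
       ```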

        • 1. Re: Configuring the underlying Postgres database
          Toby Erkson

          The Tableau Server db?  Why, to what end?

          • 2. Re: Configuring the underlying Postgres database
            Jeff Strauss

            Hi Nobusu.  It's best practice not to custom-configure the memory for this database because, generally speaking, the Tableau internal database is never a bottleneck.  The 25%-of-system-memory recommendation for hosting Postgres is a very general statement, and it most likely applies to databases with heavy query loads, such as data warehouses, predictive mining, and so on.  In the case of Tableau, all queries against the db are quite light since it's more of an application database, and the default memory allocation is enough.

            • 3. Re: Configuring the underlying Postgres database
              Nobusu Hide

              I suppose it depends on the dataset size you're working with. The `shared_buffers` parameter is used for practically everything in Postgres (joins, subqueries, indexes, literally everything). I would like to attempt modifying it, and I take full responsibility for whatever bad things might happen. It's a test server.


              The reason I'm interested in this is that we implement filters on our datasets. Those filters will inevitably hit indexed columns, and with so little memory allocated to `shared_buffers`, the majority of the indexes will not be in memory. That in turn will be bad for performance.


              As I have a server with a lot of memory, much of it sitting free, I would like to investigate the performance impact of bumping up `shared_buffers`.
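
              Rather than guessing, I'd like to first measure what's actually sitting in the buffer cache using the pg_buffercache contrib module -- again just a sketch, and it assumes the extension can be installed on Tableau's internal database, which may well not be permitted:

              ```sql
              -- Requires the pg_buffercache contrib extension (assumes it can
              -- be installed on this database, which may not be allowed).
              CREATE EXTENSION IF NOT EXISTS pg_buffercache;

              -- Which relations (tables and indexes) currently occupy the
              -- buffer cache, assuming the default 8 kB block size.
              SELECT c.relname,
                     count(*) AS buffers,
                     pg_size_pretty(count(*) * 8192) AS buffered
              FROM pg_buffercache b
              JOIN pg_class c
                ON b.relfilenode = pg_relation_filenode(c.oid)
              WHERE b.reldatabase = (SELECT oid FROM pg_database
                                     WHERE datname = current_database())
              GROUP BY c.relname
              ORDER BY buffers DESC
              LIMIT 20;
              ```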

              • 4. Re: Configuring the underlying Postgres database
                Michael Gillespie

                But Nobu, the data you're working with isn't stored in the Postgres database.  That database exists to support Tableau's own needs, not your data.


                Any filters you implement on your own data have nothing to do with Tableau's own Postgres instance.

                • 5. Re: Configuring the underlying Postgres database
                  Nobusu Hide

                  I'm happy to be corrected, but I don't think that's the case when Tableau is configured to use the BigQuery connector with the temp table (spool) approach.


                  We have a lot of data in Google BigQuery, and we have Tableau configured to save the rows from BigQuery to a temp table. It is my understanding that this temp table lives in the Postgres database on the Tableau Server. It's (almost) certainly not in BigQuery, as I do not see any extra charges from Google when churning over the data in the temp table.


                  I'm happy to be corrected by more knowledgeable people, but my assumption is that the temp table is in the Postgres database. If that is correct, then the 16MB `shared_buffers` setting is going to be inadequate for the datasets I'm working with.
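
                  One way to settle where that spool table actually lives would be to look for unusually large relations in the workgroup database while a spool job is running -- a sketch, assuming a connection with enough privileges to read the catalog (the documented read-only repository user might be sufficient):

                  ```sql
                  -- List the largest relations in the connected database.
                  -- If multi-GB spool tables were landing in Postgres,
                  -- they should show up near the top of this list.
                  SELECT c.relname,
                         c.relkind,
                         pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size
                  FROM pg_class c
                  JOIN pg_namespace n ON n.oid = c.relnamespace
                  WHERE c.relkind IN ('r', 'm', 't')  -- tables, matviews, TOAST
                    AND n.nspname NOT IN ('pg_catalog', 'information_schema')
                  ORDER BY pg_total_relation_size(c.oid) DESC
                  LIMIT 20;
                  ```

                  If nothing large shows up, that would tell me the spooled rows aren't landing in Postgres after all.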

                  • 6. Re: Configuring the underlying Postgres database
                    Nobusu Hide

                    Additionally, I should say that we have a daily extract scheduled that downloads the data into Tableau, and our workbooks query the extract. That must mean the data is in the Postgres database, right?

                    • 7. Re: Configuring the underlying Postgres database
                      Michael Gillespie

                      Well, no, actually.  My understanding of the architecture is that the Hyper in-memory engine handles all of the interaction with source data, not the Postgres database.


                      From the Tableau docs: "If the result set is too large, the BigQuery API returns an error and the Tableau BigQuery connector retries the query by saving the results into a BigQuery temp table. The BigQuery connector then reads from that temp table, which is a spool job that uses the bq-large-fetch-rows setting."  - Note the language: it puts the results in a BigQuery temp table, not a Tableau temp table.


                      Extracts do not use Postgres, only Hyper (or the TDE engine in older versions).


                      I would be thrilled for a Tableau employee to weigh in on this thread.  I see a lot of confusion about what the Postgres database is used for and having a KB article or at least a forum discussion that lays out the facts once and for all would be extremely helpful.

                      • 8. Re: Configuring the underlying Postgres database
                        Toby Erkson

                        I think it would be best to contact Tableau Support about this.  Also, with a fully running (i.e., production) TS, memory is being used for various caching activities, extract and subscription processing, etc., so there typically isn't a huge chunk sitting unused -- at least in my environment.


                        It would be interesting to see if this is possible and how it affects things, but, again, Tableau Support is your best bet for getting any real answers.