
    40 Million Rows

    jyoti thakar

      What do you recommend, an Extract or a Live connection, when accessing a table with 40 million rows and 100 columns?

      What are some of the hands-on practices you follow when dealing with data sets of this size?

       

      Our dashboards take more than 25-30 seconds to load on a Live connection.

       

        • 1. Re: 40 Million Rows
          Jeff Strauss

          The answer usually depends on a set of factors:

          - 40 million rows with 100 columns isn't excessive if it's done right

          - what is your backend database for the dashboard that takes 25 to 30 seconds?

          - have you checked where the performance bottleneck may be?  If you append ?:record_performance=yes onto the URL making the request and you see that the green bars (queries) make up the majority of the rendering time, then you need to adjust the way you are querying (see the sketches after this list).  Create a Performance Recording

          - do you have fast disk throughput available on your Tableau Server?  If so, then an extract could be an option
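
          As a minimal illustration of the performance-recording step above, here is a Python sketch of appending the parameter to a view URL; the server name and view path are hypothetical placeholders:

              # Minimal sketch: append the performance-recording parameter to a
              # Tableau Server view URL (hypothetical server name and view path).
              base_url = "https://tableau.example.com/views/SalesWorkbook/Overview"
              perf_url = base_url + "?:record_performance=yes"
              print(perf_url)
              # https://tableau.example.com/views/SalesWorkbook/Overview?:record_performance=yes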
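
          If the green bars do point at the queries, it can also help to time the same query outside of Tableau to confirm that the database itself is the slow part. A minimal sketch, assuming your backend is reachable via pyodbc; the DSN and table name are hypothetical:

              # Minimal sketch: time the backend query outside Tableau to see
              # whether the database, rather than the dashboard, is the bottleneck.
              import time
              import pyodbc  # assumption: an ODBC driver exists for your backend

              conn = pyodbc.connect("DSN=my_backend")   # hypothetical DSN
              cursor = conn.cursor()

              start = time.perf_counter()
              cursor.execute("SELECT COUNT(*) FROM big_table")  # hypothetical table
              rows = cursor.fetchone()[0]
              print(f"{rows} rows; query took {time.perf_counter() - start:.1f}s")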