2 Replies Latest reply on Jun 18, 2014 12:55 PM by Russell Christopher

    Handling large datasets hosted on Oracle server

    John Clair

  I’m using Tableau to build out a dashboard for data that is stored on an Oracle server. The issue is that some of the data I need is large (100+ million rows), and this seems to be too large for Tableau to handle. Is there a setting or method for improving performance when dealing with large datasets?  Thanks!

        • 1. Re: Handling large datasets hosted on Oracle server
          vikram bandarupalli


          Here are my thoughts,


          -Technically, Tableau can handle 100+ million rows of data. The question is whether you really need all of that data in the first place. Regardless, extracts are the best approach in this situation. Keep in mind that the extract will be huge and might take a lot of time to create; however, once you create the extract and load it on the server, you will be able to analyze the data quickly.

          -Make sure you have enough disk space and memory on the server. I don't think it's a good idea to create such a large extract on a local machine. Instead, create a small sample extract locally and run the full extract on the server.

          -There are definitely ways to reduce the size of the extract: using context filters, hiding unused fields, aggregating the data, and so on.


          -You can find more details here,

          Tips for Working with Extracts | Tableau Software
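
          To illustrate the aggregation idea above: rather than extracting 100+ million raw rows, you can pre-aggregate on the database side to the grain your dashboard actually displays, and point Tableau at the summary table. Below is a minimal, runnable sketch of that pattern. It uses Python's built-in sqlite3 purely as a stand-in for Oracle, and the `sales` table, column names, and values are all hypothetical; against a real Oracle server the equivalent would be a `CREATE TABLE ... AS SELECT` summary table or a materialized view that the Tableau extract reads instead of the raw data.

          ```python
          import sqlite3

          # Stand-in for the Oracle source: a hypothetical row-level sales table.
          # (sqlite3 is used only so this sketch runs anywhere; the same idea in
          # Oracle would be a CTAS summary table or a materialized view.)
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE sales (sale_date TEXT, region TEXT, amount REAL)")
          conn.executemany(
              "INSERT INTO sales VALUES (?, ?, ?)",
              [("2014-06-01", "East", 100.0),
               ("2014-06-01", "West", 50.0),
               ("2014-06-02", "East", 75.0)],
          )

          # Pre-aggregate to the grain the dashboard needs (day x region).
          # A Tableau extract built on sales_daily pulls a few rows per
          # day/region instead of every underlying transaction.
          conn.execute("""
              CREATE TABLE sales_daily AS
              SELECT sale_date, region, COUNT(*) AS n_rows, SUM(amount) AS total
              FROM sales
              GROUP BY sale_date, region
          """)

          rows = conn.execute(
              "SELECT * FROM sales_daily ORDER BY sale_date, region"
          ).fetchall()
          for row in rows:
              print(row)
          ```

          The same trade-off Vikram describes applies here: you give up row-level detail in exchange for a far smaller extract that builds and refreshes much faster.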


          Hope this helps.



          • 2. Re: Handling large datasets hosted on Oracle server
            Russell Christopher

            John, where does the data live? Are you querying it directly in Oracle, or is the data local, in a Tableau Data Extract?


            Vikram seems to be assuming that you are using a TDE, but if you aren't, your problem is most likely Oracle itself. You might consider using an extract instead of a "direct connection" to Oracle if Oracle can't answer your questions quickly enough.