2 Replies Latest reply on Oct 14, 2019 6:19 AM by michael lomas

    Databricks Connector

    michael lomas

      I'm using the new Databricks connector in 2019.3 to connect to our Delta Lake. I noticed when I went to publish a dashboard to our Tableau server there was some text at the bottom of the publishing window stating that my data source required extract creation. I have 'Live Connection' selected in the data source, so this isn't something I'd expect. Does anyone know if this is specific to the connector or whether there might be some other forces at play here? Our data sources are way too big for extracts, and extract usage doesn't allow us to take advantage of the distributed compute available in Databricks/Spark.

        • 1. Re: Databricks Connector
          Heyward Drummond

          The driver was written and is supported by the vendor. The Tableau drivers page points to how to obtain it, along with a full page of information on using it, located here: Tableau — Databricks Documentation


          I have not seen them mention needing to use an extract, but they control that behavior. You could file a support ticket with Databricks and ask whether the driver is going to be changed to accommodate your needs.


          Tableau relies on the technology vendors, who write most of the drivers using their APIs and ours.


          Hope that sheds some light.

          • 2. Re: Databricks Connector
            michael lomas

            Thank you, Heyward! I will follow up with our support partners at Databricks. Forcing extract creation seems sort of pointless in this instance, so hopefully it's something on my end.