
    Hive tables not showing up in Search

    Adarsh Shekhar

      Hi,

      I am trying to connect Tableau Desktop 10 (mac) to  Hive (2.1.1) via Spark SQL 2.1 (on centos 7 server). I am connecting via Simba ODBC driver with Authentication = Username and Username = . It doesn't give any error but I don't see the tables which are available in Hive. After searching and choosing 'default' schema, and searching for tables, I only see default (default.default) table. However, when I use beeline on the server to connect to Spark SQL, the hive tables are visible.
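
      For reference, the check I do with beeline on the server looks roughly like this (port 10000 is the default Spark Thrift Server port and myuser is a placeholder; adjust both to your environment):

        # connect to the Spark Thrift Server from the server itself
        # (localhost:10000 and myuser are placeholders for my setup)
        beeline -u jdbc:hive2://localhost:10000 -n myuser \
          -e "SHOW DATABASES;" \
          -e "SHOW TABLES IN default;"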

       

      If I use the Custom SQL feature, I can successfully query the tables and use the data, but I still have no way to list the tables in Tableau.
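
      For example, a Custom SQL query along these lines works and returns data (my_hive_table is just a placeholder for one of my Hive tables):

        SELECT * FROM default.my_hive_table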

       

      [screenshot: Tableau table search showing only the default.default table]

       

      This is what I got back from Tableau Support, but I don't know what I am missing in my Hive configuration:

      "As a preliminary step to connecting to the Spark SQL tables, enable access to a SchemaRDD registered in a catalog that is outside of the local context.  Currently today, the Hive Metastore (“Hive context”) is the only supported service. The Spark SQL connector evolved out of the Hive connector,  thus the need for the Hive Thrift Server."

       

      Below is the content of my hive-site.xml:

      <configuration>
        <property>
          <name>javax.jdo.option.ConnectionURL</name>
          <value>jdbc:mysql://localhost:3306/metastore_db?useSSL=true&amp;verifyServerCertificate=false&amp;createDatabaseIfNotExist=true</value>
        </property>
        <property>
          <name>javax.jdo.option.ConnectionDriverName</name>
          <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
          <name>javax.jdo.option.ConnectionUserName</name>
          <value>USERNAME</value>
        </property>
        <property>
          <name>javax.jdo.option.ConnectionPassword</name>
          <value>PASSWORD</value>
        </property>
        <property>
          <name>hive.exec.dynamic.partition</name>
          <value>true</value>
        </property>
        <property>
          <name>hive.exec.dynamic.partition.mode</name>
          <value>nonstrict</value>
        </property>
        <property>
          <name>hive.exec.max.dynamic.partitions</name>
          <value>1000</value>
        </property>
        <property>
          <name>hive.exec.max.dynamic.partitions.pernode</name>
          <value>1000</value>
        </property>
        <property>
          <name>datanucleus.autoCreateSchema</name>
          <value>false</value>
        </property>
        <property>
          <name>datanucleus.fixedDatastore</name>
          <value>true</value>
        </property>
        <property>
          <name>datanucleus.autoStartMechanism</name>
          <value>SchemaTable</value>
        </property>
        <property>
          <name>hive.metastore.uris</name>
          <value>thrift://SERVERNAME:9083</value>
        </property>
      </configuration>
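
      To rule out the Thrift Server not seeing this file at all, my understanding is that hive-site.xml has to be in Spark's conf directory and the Thrift Server restarted afterwards. A rough sketch of that step (the source path and $SPARK_HOME are placeholders for my layout):

        # make hive-site.xml visible to Spark, then restart the Thrift Server
        cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/
        $SPARK_HOME/sbin/stop-thriftserver.sh
        $SPARK_HOME/sbin/start-thriftserver.sh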

       

      I'd greatly appreciate any help in fixing this issue.