
    Tableau - Hive - Spark

    Venu Tummala

      The Tableau client can connect to Hive Metastore tables in Hadoop using Hive or Impala drivers. When it connects using the Hive driver through HiveServer2, every interaction in Tableau runs a corresponding MapReduce job in Hadoop. I want to use Spark instead of MapReduce when I connect to Hadoop from Tableau.

       

      After the development of Hive-on-Spark (HOS), Hive queries and scripts can now optionally run on either the MapReduce engine or the Spark engine.
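
      For example, in a Hive CLI or beeline session the engine can be switched per session with a set statement. This is just a sketch of how HOS engine selection works (the table name my_table is a placeholder, and it assumes Hive on Spark is already installed and configured on the cluster):

          -- run this query on the Spark engine instead of MapReduce
          set hive.execution.engine=spark;
          select count(*) from my_table;

          -- switch back to MapReduce for the rest of the session
          set hive.execution.engine=mr;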

      My question is: from the Tableau client, if we use the Hive driver and connect through HiveServer2, is there a way to force the Hadoop execution engine to be Spark instead of MapReduce?
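
      What I'm hoping is possible is something like putting that same set statement into the connection itself, for example through Tableau's Initial SQL option if the Hive connector exposes it. A sketch of the idea, not something I've verified:

          -- Initial SQL run once when Tableau opens the connection
          -- (assumption: the Hive/Hadoop connector supports Initial SQL)
          set hive.execution.engine=spark;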

       

      Thanks

      Venu