
    Tableau with Spark

    michael chen

      Hi,


      We want to integrate one of your products with Spark in our software solution, but we have some questions:

      1. With these products working with Spark, are there any limitations on file size, dimensions, or data structure?
      2. Is it possible to process the data and visualize it within five minutes?


      In our solution:

      1. It generates about 30 GB of data per day, spread across 10 CSV files.
      2. Each CSV file contains millions of rows.
      3. The data has about 3-6 dimensions (e.g. population by country, region, province, district, etc.).
      4. Data arrives every 5 minutes and must be processed and visualized (bar charts, pie charts, etc.) on the front end (see the sketch after this list).
      5. Users will define their own charts from our data, such as multi-dimensional column charts.
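      To make item 4 more concrete, here is a minimal sketch of the kind of Spark pre-aggregation we are considering before handing data to the visualization layer. The file path, the output table name, and the column names (country, region, province, district, population) are placeholders rather than our real schema.

          # Minimal sketch (PySpark): roll one 5-minute batch of CSV files up to the
          # few dimensions we actually chart, so the front end only queries aggregated
          # rows instead of the raw 30 GB/day. Paths and column names are placeholders.
          from pyspark.sql import SparkSession
          from pyspark.sql import functions as F

          spark = SparkSession.builder.appName("five-minute-rollup").getOrCreate()

          # Read the CSV files produced in the latest 5-minute window.
          raw = (
              spark.read
              .option("header", "true")
              .option("inferSchema", "true")
              .csv("/data/incoming/latest_batch/*.csv")
          )

          # Aggregate millions of raw rows down to the dimensions we chart.
          rollup = (
              raw.groupBy("country", "region", "province", "district")
                 .agg(F.sum("population").alias("total_population"))
          )

          # Persist the small aggregated result where the visualization layer can read it.
          rollup.write.mode("overwrite").saveAsTable("rollup_by_district")

          spark.stop()

      The idea is that the front end would only ever query the small aggregated table, not the raw daily files.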


      Can anyone give us some suggestions or thoughts on these issues?


      Thanks / Michael