3 Replies Latest reply on Nov 17, 2015 10:44 PM by Alex Katona

    Updating published data source fields

    Fouad Zreik



      I am facing some limitations when trying to update published data source fields.


      This is how I currently create and use my data source, which I am not sure is best practice.


      1. Connect via Desktop to Google BigQuery via custom SQL (all my data is on Big Query)
      2. Save that connection as a data source locally
      3. Create a local extract
      4. Publish the Data Source + extract + incremental schedule to Tableau Server
      5. Use that connection as my new Data Source and create a Local Extract from Tableau Server to speed up my work on the Desktop
      6. Publish reports using the data source on the Tableau server so that the users experience good performance


      Limitations I am facing:

      1. If I want to add new calculated fields to the data source on the server, I have to republish the data source to make it available for other users and reports. Is there a way around this?
      2. Publishing the data source with an extract means that I have to recreate a local extract directly from BigQuery before I can publish. This takes a long time and is somewhat redundant, since I only need the full data set on the server. Is there a way to create the full extract directly on the server?
      3. I am facing a limitation on Google BigQuery extracts that return more than 128 MB of data, so a full refresh is not possible. Is there another way of doing it?
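For limitation 3, one workaround I have been considering (not sure if it is the intended approach) is to split the full refresh into several date-ranged queries, each small enough to stay under the 128 MB result cap, and let the server's incremental refresh stitch them together. Below is a minimal Python sketch of the chunking idea; the table name `my_dataset.events` and the column `event_date` are hypothetical placeholders for your own schema, and the 7-day window is just a guess at a chunk size that keeps each result under the limit:

```python
from datetime import date, timedelta

def date_chunks(start, end, days=7):
    """Yield (chunk_start, chunk_end) half-open ranges covering [start, end)."""
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(days=days), end)
        yield cur, nxt
        cur = nxt

def chunked_queries(table, date_col, start, end, days=7):
    """Build one SELECT per date chunk, so no single query result
    exceeds the ~128 MB BigQuery response limit."""
    return [
        f"SELECT * FROM {table} "
        f"WHERE {date_col} >= '{s}' AND {date_col} < '{e}'"
        for s, e in date_chunks(start, end, days)
    ]

# Hypothetical example: January 2015 split into 7-day windows -> 5 queries.
queries = chunked_queries("my_dataset.events", "event_date",
                          date(2015, 1, 1), date(2015, 2, 1), days=7)
for q in queries:
    print(q)
```

Each chunk could then be appended via Tableau's incremental extract refresh instead of one oversized full refresh. Has anyone tried something along these lines?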