1 Reply Latest reply on May 8, 2018 10:08 AM by Nathan Mannheimer

    Tableau Server API vs Web Data Connector for JSON Data

    Eric Pickard

      Hi everybody,


      I am still relatively new to Tableau, and I have recently inherited some Tableau reports that I would like to automate. I have been given Python code that authenticates with multiple data stores and APIs in our organization, aggregates the data, and creates multiple JSON files containing the information used for the reports. These files are hosted on separate internal URLs and are updated once a day. When I download these files and then upload them to Tableau Desktop, Tableau auto-detects the columns for the table. In an ideal world, we would tell Tableau the URL of some JSON data (whether it's a static file or an API), and Tableau would fetch the data, recognize the columns, build the table, let us build the visualizations and set the correct data types for the columns, and then auto-update the report by calling the URL we provided again. (Currently I have to update our reports by manually uploading the new JSON files every couple of days.)


      I have looked into the Web Data Connector (WDC), but the WDC requires you to define the table schema and data type for each column of your data. I have done this for one of our current JSON files, but it does not scale well. We do not want to write a new connector for each new file, and it seems strange that Tableau does not expose the column auto-detection it already has for JSON files in the WDC as well. Our team will be bringing multiple new systems online this year containing information we would like to add to our Tableau reports, on top of the reporting we already have. We would rather have one tool that can grab JSON from any location and auto-detect columns than write and maintain a separate WDC for each system.
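      As a rough sketch of the auto-detection we are after, a small Python helper can scan the JSON records and guess a type per column before the data ever reaches a connector. (This is purely illustrative; the function name and type labels are not part of any Tableau API.)

```python
import json

# Illustrative sketch only: approximate Tableau Desktop's JSON column
# auto-detection in plain Python. Nothing here is a Tableau API.

def infer_schema(records):
    """Scan a list of flat JSON objects and guess a type per column."""
    schema = {}
    for rec in records:
        for key, value in rec.items():
            if isinstance(value, bool):   # check bool before int: bool is an int subclass
                guessed = "bool"
            elif isinstance(value, int):
                guessed = "int"
            elif isinstance(value, float):
                guessed = "float"
            else:
                guessed = "string"
            if schema.get(key, guessed) != guessed:
                schema[key] = "string"    # widen to string on any conflict
            else:
                schema[key] = guessed
    return schema

sample = json.loads('[{"id": 1, "name": "a", "score": 9.5},'
                    ' {"id": 2, "name": "b", "score": 8.0}]')
print(infer_schema(sample))
```

      Any column whose guessed type conflicts between records is widened to string as a safe fallback.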


      I am also beginning to look into the Tableau Server API. Is it possible to replace a JSON data file in a workbook using this API? If so, I imagine I could use the existing Python code to push the JSON files to Tableau every time it runs. The only downside is that I would have to write a separate job to call each API and forward its response to Tableau, instead of simply giving Tableau the URL.


      Should we go the route of writing a separate WDC for each data source, or is there a better way of doing things that uses Tableau's ability to auto-detect columns? Will Tableau ever bring the column auto-detection it has for static JSON files to the WDC?


      Thank you in advance for your help!

        • 1. Re: Tableau Server API vs Web Data Connector for JSON Data
          Nathan Mannheimer

          Hi Eric,


          Your idea for a web-enabled version of the JSON connector is a good one, and I would encourage you to submit it to the Ideas forum, as it fits your use case well! Currently there is no way to use the auto-schema detection of the Desktop data prep system to automatically flatten a web JSON source and then publish and append the results to Server.


          I would not recommend simply swapping out the JSON files, because this can lead to problems if the structure changes. One possible workaround to explore is to create an extract of the JSON files and publish that extract to Server, with the JSON file stored on a network drive the server has access to. When your Python script creates new JSON files, you would update the code to automatically append to or replace the data in the existing file on the network drive. If Tableau Server can point to that file to refresh the extract, it should bring in the new data, provided the schema remains the same. This is detailed in this post, but for CSV instead of JSON:


          Refreshing a Server Connection to a Flat File on a Network Drive
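          As a minimal sketch of that update step (the path handling and function names are illustrative), the nightly job could swap the file atomically and refuse to overwrite it if the columns have changed, so the scheduled extract refresh never sees a half-written or incompatible file:

```python
import json
import os
import tempfile

# Illustrative sketch only: replace the JSON file on the network share,
# refusing on column drift, and swap the file atomically. The path
# handling assumes the share is mounted as a local directory.

def _columns(records):
    cols = set()
    for rec in records:
        cols.update(rec.keys())
    return cols

def replace_if_schema_matches(path, new_records):
    """Atomically replace the JSON file at `path`, rejecting schema drift."""
    if os.path.exists(path):
        with open(path) as f:
            old_records = json.load(f)
        if _columns(old_records) != _columns(new_records):
            raise ValueError(
                f"schema drift: {_columns(old_records) ^ _columns(new_records)}")
    # Write to a temp file in the same directory, then rename, so the
    # extract refresh never reads a partially written file.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".", suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(new_records, f)
    os.replace(tmp, path)
```

          Writing the temp file in the same directory and renaming keeps the replacement atomic on the same volume.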


          I think there are two other possible routes to take here:


          1) As you mentioned, you can create web data connectors for the sources, using the WDC framework to handle authentication and the conversion of JSON to a table format. The advantage is that you only have to do this once and can then use each WDC to continually refresh its data. The work will also carry over from one source to another, so you wouldn't have to start from scratch each time.


          2) Use your existing Python scripts to parse the JSON into tables, then use the Tableau Server API along with the Extract API to create Tableau extracts and publish them to Server. This method would require running your Python scripts on a schedule to update the data.
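          For the parsing half of option 2, a plain-Python sketch (the record shape here is made up) can flatten nested JSON into a single table with dot-notation column names, ready to feed to the Extract API or write out as CSV:

```python
import csv
import io
import json

# Illustrative sketch only: flatten nested JSON records into one flat
# table. The record shape below is hypothetical.

def flatten(record, prefix=""):
    """Flatten one nested JSON object into a single-level dict."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name + "."))
        else:
            flat[name] = value
    return flat

def to_csv(records):
    """Render flattened records as CSV text with a union-of-keys header."""
    rows = [flatten(r) for r in records]
    header = sorted({k for row in rows for k in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=header)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

data = json.loads('[{"id": 1, "user": {"name": "a", "city": "x"}}]')
print(to_csv(data))
```

          Taking the union of keys across all records keeps the header stable even when some records omit a field.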