Your "best" option is subjective.
The best option, in my opinion, is to set up an ETL process that loads everything into a database and connect Tableau directly to the transformed database.
This will require you to implement and configure ETL jobs for each source, either in an existing platform (e.g. Alteryx, SSIS) or custom-built with a scripting tool you're familiar with (e.g. C#/.NET, Python).
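To give a feel for how small a custom ETL job can start out, here's a minimal sketch in Python using only the standard library. The CSV filename, column names, and SQLite table are placeholders for illustration, not anyone's actual schema:

```python
import csv
import sqlite3

def run_etl(source_csv, db_path):
    # Extract: read rows from a source export (hypothetical file/columns)
    with open(source_csv, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalize a text field, coerce a number, drop incomplete rows
    cleaned = [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r.get("name") and r.get("amount")
    ]

    # Load: insert into a local SQLite table that Tableau can connect to
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO sales (name, amount) VALUES (:name, :amount)", cleaned
    )
    con.commit()
    con.close()
    return len(cleaned)
```

A real job would add incremental loads, logging, and error handling, but the extract/transform/load shape stays the same; a scheduler (cron, Task Scheduler) runs it on whatever cadence you need.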
It sounds like either of those approaches is going to be outside your technical abilities, so the "best" option is to either hire/find someone to do the above, or build something with the tools you're already comfortable with. You're probably not going to get everything automated exactly the way you want it, but that's a sacrifice you'll have to make.
I do agree that bringing everything into one source will make your life easier in some ways, but how much depends on how your sources are structured within Tableau, whether you can easily join them through Tableau, etc.
Do take a look at Alteryx, though - they target people with use cases like yours, so it could be a good option.
Thanks for the response!
Of course a proper ETL process would be ideal, but in the face of a swamped engineering team and my rusty Python skills, I'm looking for the next best thing. I looked into Alteryx, but it appears to run only on Windows.
For now it looks to be manual dumps into a Google Sheet,
Unless..........[insert next reply]
I wish I had an unless, but unfortunately I don't. At the end of the day you will need some sort of scripting somewhere to make this more automated.
If you're interested in learning that, I would suggest you pick one of your sources, investigate how you can pull data from it (via an API, scheduled download, CSV export, etc.), and learn how to execute on that with your chosen scripting language, e.g. Python.
As an example, start simple: learn how to pull your dataset from Salesforce via their API into a CSV stored on your computer. Once you've got that figured out (which will not be an easy feat coming from nothing, by the way), figure out how to use Python to pull the data from the API and insert it into a database, Google Sheet, etc.
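That first step might look something like the sketch below, using only Python's standard library. The API version, object, fields, and instance URL are illustrative, and it assumes you've already authenticated (e.g. via OAuth) to get an access token - Salesforce auth is its own learning curve:

```python
import csv
import json
import urllib.parse
import urllib.request

def salesforce_to_csv(instance_url, access_token, out_path):
    # Build a SOQL query against the Salesforce REST query endpoint
    # (object and fields here are examples, not a real org's schema)
    query = urllib.parse.urlencode(
        {"q": "SELECT Name, Amount, CloseDate FROM Opportunity"}
    )
    req = urllib.request.Request(
        f"{instance_url}/services/data/v58.0/query?{query}",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        records = json.load(resp)["records"]

    # Write the records to a local CSV you can open or load elsewhere
    fields = ["Name", "Amount", "CloseDate"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for rec in records:
            writer.writerow({k: rec.get(k) for k in fields})
    return len(records)
```

Once a script like this works, the second step is mostly swapping the CSV-writing section for inserts into a database (or calls to the Google Sheets API) and putting the script on a schedule.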
Benjamin, are you using something like Google BigQuery or Amazon Athena, Redshift, or Redshift Spectrum? There are some options if you are. For example, I have a post that describes how to export tracking data from Salesforce Marketing Cloud: https://blog.openbridge.com/export-tracking-data-from-salesforce-marketing-cloud-8a0a4c1f37dc

Also, if you are looking to create these "data pipelines" from various data sources, I have another post that dives into the topic: https://blog.openbridge.com/the-new-openbridge-data-pipeline-marketplace-say-goodbye-to-data-that-is-messy-complicated-scary-c0c2fecf243b

There are also some pretty interesting things you can do using Zapier plus webhooks to route data from hundreds of data sources to a system like Google BigQuery. The same is true for streaming real-time data from any system where you have deployed Google Tag Manager. Happy to discuss further!