I would keep the aggregated dataset in the Oracle database, in a separate tablespace, and connect Tableau live to those tables. The data volume and velocity of change are fairly high, and keeping both the original data and the summaries in Oracle gives you a lot of control over how and when you improve your aggregated dataset.
Maybe next month you will be forced to aggregate into 4-hour or 8-hour windows, or to drop some columns because your preliminary analysis shows they are redundant from a business-logic perspective. Creating the new dataset is very easy when you have things under control.
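As a rough sketch of what that could look like: a materialized view that buckets rows into 4-hour windows, kept in its own tablespace. The table and column names here (`events`, `event_ts`, `amount`, `agg_ts`) are placeholders — substitute your actual schema, and adjust the aggregates to the columns you really need.

```sql
-- Hypothetical source table EVENTS(EVENT_TS, AMOUNT); names are illustrative only.
-- Buckets each row into a 4-hour window starting at 00:00, 04:00, 08:00, ...
CREATE MATERIALIZED VIEW events_agg_4h
  TABLESPACE agg_ts            -- the separate tablespace for summaries
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND   -- refresh on your own schedule
AS
SELECT TRUNC(event_ts)
         + FLOOR(TO_NUMBER(TO_CHAR(event_ts, 'HH24')) / 4) * 4 / 24
         AS window_start,
       COUNT(*)    AS row_cnt,
       SUM(amount) AS total_amount
FROM   events
GROUP  BY TRUNC(event_ts)
         + FLOOR(TO_NUMBER(TO_CHAR(event_ts, 'HH24')) / 4) * 4 / 24;
```

Point Tableau's live connection at the materialized view instead of the raw table. If you later need 8-hour windows or fewer columns, you just drop and recreate the view with a different bucket expression, without touching the source data or the Tableau workbook structure.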
Thanks. It turns out that our dataset has a ton of superfluous fields. After we hid the unused fields and re-ran the extract, it reduced the size by nearly half! Performance is now much better. Hopefully we won't have to go down the route of aggregating to 4-hour periods.