Here are my thoughts:
- Technically, Tableau can handle 100+ million rows of data. The question is whether you really need all of that data in the first place. Regardless, extracts are the best approach in this situation. Keep in mind that the extract will be huge and may take a long time to create; however, once you create the extract and publish it to the server, you will be able to analyze the data quickly.
- Make sure you have enough disk space and memory on the server. I don't think it's a good idea to create such a large extract on your local machine. Instead, create a small extract locally to validate the workbook, then run the full extract on the server.
- There are definitely ways to reduce the size of the extract: using context filters, hiding unused fields, aggregating data for visible dimensions, and so on.
- You can find more details here,
Hope this helps.
John, where does the data live? Are you querying it directly in Oracle, or is the data local, in a Tableau Data Extract?
Vikram seems to be assuming that you are using a TDE, but if you aren't, your problem is most likely Oracle itself. You might consider using an extract instead of a live ("direct") connection if Oracle can't answer your queries quickly enough.