The major concern with Custom SQL is when you are connecting live. Every time you drag a field to a new location in the view, a new query is generated and run against the live connection -- and your custom SQL has to be wrapped inside that query, rather than Tableau knowing the metadata and being able to optimize the queries itself.
When using an extract, there really won't be any performance difference between custom SQL and joining tables in Tableau when building views. However, complex custom SQL can make the extract itself take longer to build than simple joins would.
If you were to run your query in a query editor, how many rows does it return, and how long does it take? Is it possible that your joins are turning 500k rows into many more than you expect?
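To illustrate that last point: a join against a "many" side multiplies rows, so 500k base rows can easily become millions. Here is a minimal sketch using Python's built-in sqlite3 (the table and column names are made up for the example):

```python
import sqlite3

# In-memory database with a hypothetical one-to-many relationship:
# each order can have several order_items rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER);
    CREATE TABLE order_items (order_id INTEGER, item TEXT);
    INSERT INTO orders VALUES (1), (2);
    INSERT INTO order_items VALUES
        (1, 'a'), (1, 'b'), (1, 'c'),
        (2, 'x'), (2, 'y');
""")

# 2 rows in orders, but the join returns 5 rows --
# the result size is driven by the "many" side.
n = conn.execute("""
    SELECT COUNT(*)
    FROM orders o
    JOIN order_items i ON o.order_id = i.order_id
""").fetchone()[0]
print(n)  # 5
```

The same effect at scale is why a 500k-row table can blow past a few million rows after joining.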
When I run it through the MySQL Query Browser, I think it does go beyond a few million rows, and the query browser just runs out of memory for the result set. Since I get too many results, I can always put a LIMIT on the total rows returned. That leads to the following question.
If I'm limiting the results to, say, 500k rows and using a live connection, and I have a date range quick filter, will the quick filter operate only on the 500k rows returned by my query, or will it apply the filter's settings first and then try to return up to 500k matching rows? I want to make sure I have the largest possible data set for my analysis.
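Based on the wrapping behavior described above, the filter would be applied outside the LIMITed custom SQL, so it only sees the rows the inner query already returned. A small sqlite3 sketch of that wrapping pattern (scaled down, with hypothetical names; exactly how Tableau generates the outer query may differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (event_date TEXT);
    INSERT INTO events VALUES
        ('2023-01-01'), ('2023-01-02'), ('2023-01-03'),
        ('2023-01-04'), ('2023-01-05');
""")

# Stand-in for custom SQL with a LIMIT (think LIMIT 500000).
custom_sql = "SELECT event_date FROM events ORDER BY event_date LIMIT 2"

# Tableau-style wrapping: the quick filter becomes a WHERE clause
# around the custom SQL, so it can only filter the 2 limited rows.
count = conn.execute(f"""
    SELECT COUNT(*) FROM ({custom_sql}) t
    WHERE t.event_date BETWEEN '2023-01-01' AND '2023-01-05'
""").fetchone()[0]
print(count)  # 2, not 5 -- the filter never sees the other rows
```

In other words, under this pattern the quick filter cannot reach past the LIMIT to pull in additional rows; the cap is applied first, inside the subquery.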