My 2 cents on this:
1) Try to build your Tableau datasource schema
from Vertica tables first (joining or unioning),
and use Custom SQL only as a last resort (as you did).
Here is a brief explanation why (from page 17 onward):
with the (joined) tables visible in the datasource,
Tableau generates more performant SQL,
which the Vertica optimizer can execute effectively.
Besides, one could use the Tableau-generated code
(grabbed in a Performance Recording session)
as a source to the Database Designer tool to suggest projections.
2) Alternatively, you may try to execute your script (view) as Initial SQL
(creating a TEMPORARY table ... ON COMMIT PRESERVE ROWS),
then point Tableau to this temp table as the datasource.
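A minimal sketch of what that Initial SQL might look like -- the table and column names here are made up for illustration, not taken from your script:

```sql
-- Runs once per session via Tableau's Initial SQL.
-- LOCAL TEMPORARY: visible to this session only;
-- ON COMMIT PRESERVE ROWS: rows survive each commit.
CREATE LOCAL TEMPORARY TABLE sales_prepared
    ON COMMIT PRESERVE ROWS AS
SELECT s.sale_id,
       s.sale_date,
       c.customer_name,
       s.amount
FROM   sales s
LEFT JOIN customers c ON c.customer_id = s.customer_id
WHERE  s.sale_date >= CURRENT_DATE - 90;
```

The Tableau datasource then simply selects from `sales_prepared`, so the heavy joins run once per session instead of on every viz refresh.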
Here are some concerns:
1. Does a Vertica view differ from a Custom SQL in any way? You said in your reply that I used Custom SQL, but I did not literally create a Custom SQL.
2. The script is a very big one, and I cannot avoid left joins because of business needs. I think inner joins are the essence of a datasource schema; am I right?
3. To sum up the question, let me ask it this way: is there a way of filling filters before executing the query (at least the date filter)?
1. You're right, they are almost identical --
in a sense Tableau couldn't do much with either :-)
Besides, it is possible to insert Parameters
into the Custom SQL -- so you could at least
make a date range filter with two date Parameters.
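A sketch of such a Custom SQL, assuming two date Parameters named "Start Date" and "End Date" exist in the workbook (table and column names are again illustrative):

```sql
-- Tableau substitutes the <Parameters....> placeholders
-- with the current parameter values before sending the query,
-- so the filter is applied inside Vertica, not after the fact.
SELECT sale_id,
       sale_date,
       amount
FROM   sales
WHERE  sale_date BETWEEN <Parameters.Start Date>
                     AND <Parameters.End Date>
```

In the Custom SQL dialog these placeholders are added via Insert Parameter, so changing the parameter in the viz re-runs the query with the new range.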
2. Tableau allows for Inner / Left / Right / Full Outer joins
when building a schema in a Datasource Pane.
Join Culling works for PK-FK Inner Joins only,
so in your particular case it would be of little benefit (if any).
3. Let me answer this way: joins are costly in Vertica --
and it's true for any RDBMS (either analytical or general-purpose).
As for Vertica, general recommendations are:
1) Joins should be made locally (on each and every node in the cluster).
2) If 1) does not hold -- and segment redistribution cannot be avoided --
then join materialization is needed, either via pre-joined projection(s)
or simply by writing the result set(s) to (temp or permanent) table(s).
3) If 1) holds and 2) is not an option -- and this is the most common case --
then building additional projections (optimised for the join) would be beneficial.
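As an illustration of 3), a pair of join-optimised projections might look like this (table and column names are assumptions for the example). Segmenting both sides identically on the join key lets Vertica join locally on each node, and sorting on it enables a merge join:

```sql
-- Both projections are segmented by HASH on the join key,
-- so matching rows land on the same node (no resegmentation),
-- and both are sorted on it to allow a merge join.
CREATE PROJECTION sales_by_customer AS
SELECT customer_id, sale_date, amount
FROM   sales
ORDER BY customer_id
SEGMENTED BY HASH(customer_id) ALL NODES;

CREATE PROJECTION customers_by_id AS
SELECT customer_id, customer_name
FROM   customers
ORDER BY customer_id
SEGMENTED BY HASH(customer_id) ALL NODES;

-- Populate the new projections with existing data.
SELECT REFRESH('sales, customers');
```

This is only a hand-written sketch; in practice, feeding your actual workload (e.g. the Tableau-generated queries) to Database Designer is the safer way to arrive at such projections.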