We have extracted multi-million-observation datasets with 80+ variables, and datasets with 1500+ variables and a few hundred thousand obs, without any problems at all. So narrow-and-deep and wide-and-shallow both work without a hitch for us.
Hope this helps.
Thanks for the reply.
I am using a trial version of Tableau Desktop and connecting through the ODBC driver. The SAS dataset has a field called 'Claim ID', which is the primary key; the Claim ID column contains around 9,000,00 unique values. When I try to create a parameter for Claim ID, I get messages like "Tableau is low on processing memory" and "out of memory". The total size of the SAS dataset is 230 MB.
Is there any way to solve this? Is it because of the trial version?
The trial version is the full-blown version with a 14-day timer on it, so I am fairly certain that is not the problem. How much memory do you have on that machine and is it 32- or 64-bit? One of the biggest causes of memory problems is trying to display millions of data points in a view.
You will also want to make sure to create an extract from the SAS dataset. We typically see an order-of-magnitude decrease in file size when creating an extract; in this case, your file size should drop to approximately 23 MB. FYI, this is due to all the metadata SAS datasets carry along, which gets dropped when creating an extract (indexing, sort order of the dataset, labels, variable length/type/informat/format/sequence, etc.).
Here is a link to another discussion that covers many other options:
Hope this helps.
I am trying to set up a connection to a SAS server to access SAS files. To do this, I installed the SAS/SHARE software on the SAS Linux server where the SAS files reside, installed the SAS ODBC drivers on Windows 7 x64, and then created a DSN to connect to the SAS server and access the files. But for some reason I am stuck and am not able to connect to the SAS server from Tableau.
Could you please help me out on this issue?
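For reference, the server side of this setup needs a running SAS/SHARE server that the ODBC driver can reach over TCP. A minimal sketch of what that looks like on the Linux box, assuming a server ID of shr1 registered on a TCP port in /etc/services (the server ID, port, and library path here are placeholders, not values from the original post):

```sas
/* On the Linux SAS server: start a SAS/SHARE server.              */
/* Assumes /etc/services contains an entry mapping the service     */
/* name to a TCP port, e.g.:   shr1   5010/tcp                     */

options comamid=tcp;          /* use the TCP/IP communication access method */

proc server id=shr1;          /* server ID must match what the Windows DSN references */
run;
```

On the Windows side, the SAS ODBC driver's DSN would then reference this server as machine-name.shr1 (node name, dot, server ID) on the DSN's Servers tab, with the same port defined there. If the IDs or ports don't line up on both ends, the connection attempt from Tableau will fail even though each piece looks installed correctly.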