I remember experiencing this problem a while ago. In our case it was because we were using a 32-bit version of Windows and the limit, imposed by the OS, was somewhere around 2GB. When we switched to using a 64-bit OS, the error disappeared.
A quick Google search reveals that this is a problem experienced by other applications, not just Tableau (see http://stackoverflow.com/questions/4314493/file-size-limit-for-sqlite-on-32bit-system)
There are a few things that might help in your case if the OS is not the issue or you can't upgrade to 64-bit. If your extracts are truly that large (2GB+), you might look at ways of reducing their size. Some possibilities:
1. Hide Unused Fields. Using this option on the extract dialog can greatly reduce the size because any field that is hidden will not be included in the extract.
2. Aggregate Data for Visible Dimensions. This option may work for you if all of your views are aggregating to a higher level of detail than the underlying data. It can drastically reduce extract size.
3. Can You Work with a Subset of the Data? Maybe you have 15 years of history, but you really only need to do detailed analysis of the last 5 or last 3 years. Could you have one data connection that gave you a very high level aggregation of the entire history and another that pulled detail for the last 3?
4. Can You Work Live? Tableau works very well connecting live to all kinds of databases. Poorly structured and improperly indexed/partitioned databases might do better with an extract -- but consider whether connecting live is an option in your case.
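To illustrate why option 2 can shrink an extract so much, here is a small Python sketch (with made-up data, not Tableau internals): if the views only ever use a couple of dimensions, rolling the detail rows up to those dimensions collapses many rows into one.

```python
from collections import defaultdict

# Hypothetical detail rows: (region, product, order_id, sales)
detail = [
    ("East", "Widget", 1, 10.0),
    ("East", "Widget", 2, 15.0),
    ("East", "Gadget", 3, 7.5),
    ("West", "Widget", 4, 20.0),
    ("West", "Widget", 5, 5.0),
]

# If the views only aggregate by region and product, the extract
# only needs one row per (region, product) combination:
agg = defaultdict(float)
for region, product, _order_id, sales in detail:
    agg[(region, product)] += sales

print(len(detail), "->", len(agg))  # 5 -> 3
```

On real data with millions of order-level rows but only a handful of dimensions in play, that 5-to-3 reduction can easily become millions-to-thousands.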
We can connect live, but the database server just doesn't respond well in terms of performance. I've been hiding the unused fields - that is a great suggestion for others experiencing this problem.
The data is actually structured extremely well as a snowflake/star schema for data warehousing purposes, and the indexes on the keys have been built out as well. Right now we only have one snapshot loaded on this database. The problem we're running into is that this server [dev] doesn't have enough RAM (only 4 gigs) to perform in a timely manner. In truth, we're only talking about just under 3 million rows (but a lot of dimension columns). Still, running directly against the server is a nightmare with Tableau. Some of the queries can take 10-20+ minutes to execute, and then Tableau has to render.
I'll try to find some folks with a 64 bit machine and get some help on that front; I appreciate the suggestion.