Sounds like you're using Tableau to pull data and then dump it into another application. Bad juju. A 70-column table report? That's not a report, that's more like information mining/exploration. Anyway...
Are you using the 32-bit or 64-bit version of Desktop? How much RAM does your system have? Open the Windows Task Manager and see how much of your memory is consumed before opening Desktop, then keep watching it as you run the "report".
Yes, you can filter the report based on the user ID. You need to tell us if you're using Tableau Server and/or Desktop/Reader as there are a few ways to accomplish it.
What we are trying to do is create a tabular report where you click on one ID and it displays the corresponding details, and then when you click on an ID within those details it displays the next level of detail (report - subreport).
The idea, I was told, is to click on one level and then drill down 4-5 levels... down to the detailed data itself, by creating a report/dashboard, and also to publish the data for others to use.
I am using 64-bit Desktop on Windows 7 with 16 GB RAM. I am running the import again; 3,904 MB of memory is free, and 6 million rows have completed so far. Last time I tried, it errored out at 15 million.
I am using Tableau Desktop.
Even when I tried using just a million rows, there is a 5-10 second lag when I drag and drop a column/dimension onto the sheet. Is there a way to work in design mode and then run it afterwards?
I'd recommend using an extract instead of connecting live to the db. After doing that, people will probably want you to attach the workbook so they can better see what's going on.
Are you just getting columns, or are you also performing logic (IF...THEN, etc.) and/or custom calculations? Are you using Custom SQL to get the data? I can't help more right now, but here's some info from Robert Morton that I kept, as it's helpful to know:
You should consider using the single-table or multi-table connection interface instead of Custom SQL. It's especially important to avoid prematurely grouping and aggregating your data in Custom SQL, since that can lead to mis-computed aggregations in Tableau and can easily cause performance problems.
Remove any existing GROUP BY and ORDER BY clauses from your Custom SQL.
Furthermore, remove any WHERE clauses and replace them with normal Filters in a Tableau worksheet, or define them as Data Source Filters.
If you continue to have bad performance (since many databases have trouble with the subqueries that Custom SQL requires), then consider making a Data Engine extract.
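To make Robert's advice concrete, here's a rough before/after sketch. The table and column names are made up for illustration; the point is the shape of the query, not the specifics:

```sql
-- Before: Custom SQL that pre-filters and pre-aggregates.
-- Tableau wraps Custom SQL in a subquery, which many databases handle
-- poorly, and the early GROUP BY can lead to mis-computed aggregations
-- when Tableau aggregates on top of it.
SELECT customer_id, SUM(amount) AS total_amount
FROM orders
WHERE order_date >= '2014-01-01'
GROUP BY customer_id
ORDER BY total_amount DESC;

-- After: hand Tableau the raw rows and let it do the work.
-- Recreate the WHERE clause as a worksheet filter or a Data Source
-- Filter, and let Tableau's own aggregations replace the GROUP BY
-- and sorting.
SELECT customer_id, order_date, amount
FROM orders;
```

Better still, skip Custom SQL entirely and use the single-table or multi-table connection interface, as described above.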
To refresh/update your workbook after you've made changes and want to see the results, click the Update icon:
Naturally, you can turn Auto Updates back on whenever you want.
(I apologize for the multiple posts, but I'm working on a crippled IE8 browser and the forum software doesn't play well with IE8 either. The combination really cripples my dynamic creativity.)