Do you get the same problem with smaller data sets using another server-side TDE? Also, it might sound a bit cliché, but have you tried restarting Tableau Server to see if you get the same issue?
Are you actually running your own server or are you on Tableau Online? I've had the same issue on Tableau Online version 10.1.0, and running Desktop v10.1.3.
I get exactly this behavior when trying to create a local extract of a published Tableau data source (with ~3M rows). Tableau Desktop 10.1.4 and Tableau Server 10.1.3. It works with a smaller data source.
I get the same behavior in terms of the progress dialog repeatedly showing 8 MB transferred, except it does not loop infinitely: my workbook does complete the refresh if I just let it run.
Posting here because it took me a long (aggravating) time to figure out that if I just let it continue to "loop," it would eventually finish the refresh.
Jason and Martin, I'd be interested to know whether you get the same result if you initiate the refresh and let it run for an extended period.
For comparison, my data source is a TDE on a server hosted on my company's local network, ~4M records, which took about 5 minutes to complete the refresh (I just experienced this about 20 minutes ago). Of course, your mileage may vary on the refresh duration depending on transfer speeds.
Some additional hazy memories (don't hold me to these)...
I recall a colleague experiencing the same behavior: the refresh appeared to loop but did finish when they let it run.
Also, I'm fairly positive I've experienced this same phenomenon with Server/Desktop v9.
Data source was a TDE on a server hosted on the local network, ~4M records.
I thought I had let it run for long enough last time, but apparently not! I tried again today with a ~2M-row data source over the company network, and it finished after 10-15 minutes. This is good!
So it seems it downloads chunks of 8 MB, and the only problem is that, as a user, I am not interested in seeing progress for each 8 MB chunk, but rather for the overall process.
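For illustration only (this is not Tableau's actual implementation, and `read_chunk` is a hypothetical stand-in for the server connection): a client that pulls a large payload in fixed 8 MB chunks can still report a single overall progress figure if it knows the total size up front, which is the behavior people in this thread are asking for. A minimal Python sketch:

```python
import io

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB, matching the transfer size described in this thread

def download_with_overall_progress(read_chunk, total_bytes):
    """Pull data chunk by chunk, reporting progress against the whole
    transfer instead of restarting the indicator for each chunk."""
    received = 0
    parts = []
    while received < total_bytes:
        chunk = read_chunk(CHUNK_SIZE)
        if not chunk:
            break  # source exhausted early
        parts.append(chunk)
        received += len(chunk)
        pct = 100 * received // total_bytes
        print(f"overall progress: {pct}% ({received}/{total_bytes} bytes)")
    return b"".join(parts)

# Usage sketch: an in-memory buffer stands in for the server, and the
# lambda caps reads at 8 KB so the chunking is visible on a small payload.
payload = b"x" * (20 * 1024)
src = io.BytesIO(payload)
data = download_with_overall_progress(lambda n: src.read(min(n, 8 * 1024)),
                                      len(payload))
```

The point of the sketch is simply that per-chunk POST requests (as described by Tableau support below in this thread) and a single aggregate progress bar are not mutually exclusive.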
I emailed support and got this response:
Your case #02725379 has been updated. The new comment is:
Researching the unexpected behavior, I believe I have an explanation of what is being experienced. Starting in Tableau Desktop and Tableau Server 10.0, a change was made to the way data is transferred in order to keep the POST requests to a reasonable size. The behavior may seem unperformant, but there should be no impact on the performance of the connection, and currently there is no way to modify the transfer size past the 8.0 MB limit.
If you would like, I can submit an enhancement request to make the transfer size modifiable from either the Tableau Desktop or Tableau Server end.
I believe the above information will resolve this issue. If the information provided does not resolve the issue, please let me know by responding with a few details regarding why the solution above did not resolve the issue.
I encountered this issue today, on a fairly small data set.
It happened both when first creating an extract on a published .tde data source and when refreshing the extract.
The data source only contains 296,294 rows and shows up as taking only 16 MB on the server.
Initially, I thought it was an endless loop and was cancelling it. Then I came across this thread and let it go. It eventually completed - but it took a while.
At one point I counted the progress bar filling up 20 times... 20 x 8 MB for 16 MB of data doesn't seem to add up...
Desktop version 10.1.3
Server version 10.1.3
Having the same issue. I was with a prospect doing this test, and the experience was poor.
Trying to connect to a published extract of 24 MB, we waited more than an hour for the extract; Tableau Desktop never finished it.
Tableau 10.3, Tableau Online
Same thing happened to me this week. I have a large set (~38 MB, 1.8 million rows), and the loop went for a few hours before completing.
I have just now experienced the 8 MB loop that goes on for several minutes every time I touch a disaggregated view in my workbook. I only have 4,000 marks with 12 attributes at the level of detail. There is no way this should be 25 x 8 MB of data. ***?
This 'feature' has made authoring with Tableau data sources impossible. I'm now being forced to use TDEs connected directly to the database. Lift your game, Tableau! This 'feature' obviously wasn't tested very well.
So sad, Tableau Desktop 64-bit 10.3.1 did not solve the problem. In Tableau 9.x, extract creation is slow, but you can see it advancing. In Tableau Desktop 10.x, however, extract creation sometimes never completes. So sad.
Similar behavior here. I understand the concept of breaking the transfer into 8.4 MB chunks; however, my extract refresh performance is substantially worse when coming from the Server data source to my desktop.
Data source: ~600k records
From Server: 40+ minutes (and counting)
From the original data source (Splunk database): 45 seconds on average
Again, the concept of 8.4 MB chunks is understandable and reasonable; however, my performance doesn't match what the Tableau rep described above. There is no reason a refresh from Tableau Server should take 50x longer than going straight to the original source.
This bug makes it impractical to use Server data sources and makes it very difficult to edit workbooks that have already been published. The workaround is to create a local replica of every single data source (not fun).
Tableau Server & Desktop 10.4.1
Uploading works fine, but while downloading an extract from the server, the 8 MB progress popup repeats in what looks like an infinite loop.