I am using the Extract API to generate a .hyper file from a roughly 14-million-row x 89-column table on RHEL 7.
I have 1 TB of free space on the host, but generating the extract for this large table results
in this exception:
self.set_column_values( new_table, table_def, add_index )
File "xx_.py", line 140, in set_column_values
tableau_table.insert( new_row )
File "/usr/lib/python2.7/site-packages/tableausdk/HyperExtract.py", line 545, in insert
raise Exceptions.TableauException(result, Exceptions.GetLastErrorMessage())
tableausdk.Exceptions.TableauException: TableauException (20200): ERROR: unable to buffer copy-in data
When I google the error, it seems to indicate that the disk is out of space. However, I have 1 TB free, so this makes me think that either
the problem is with the data (a column value is too long) or the Extract API is writing temp files or logs to some other directory.
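To rule out the "other directory" theory, here is a minimal check I can run (plain Python, no Tableau SDK). It is only a sketch based on my assumption that the Extract API may spill temp data into the working directory and/or the system temp dir; those candidate paths are guesses, not confirmed Extract API behaviour:

```python
import os
import tempfile

def free_gib(path):
    """Free space (GiB) available to the current user on the filesystem holding path."""
    st = os.statvfs(path)
    return (st.f_bavail * st.f_frsize) / (1024.0 ** 3)

# Candidate directories the extract process might write temp files to
# (assumption: working dir and $TMPDIR; adjust for your environment).
for path in (os.getcwd(), tempfile.gettempdir()):
    print("%s: %.1f GiB free" % (path, free_gib(path)))
```

If the temp dir sits on a small partition (e.g. a cramped `/tmp`), pointing `TMPDIR` at the 1 TB volume before running the extract would be an easy thing to try.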
Could anyone assist?