Given a column with a Hive type such as `array<struct<key:type, key2:type2...>>`, a simple `SELECT {ArrayColumn} FROM {TheTable}` produces a NumPy array error (see the traceback below).
The query hangs in SQL Lab at "X%" complete and the error (the traceback) is never returned.
The query runs through the Celery async worker; the worker raises the traceback, but Superset never picks it up.
So, two things: the NumPy error itself, and the fact that the failure never propagates back to SQL Lab.
superset-0.17.1rc2-py3.5
Traceback (from the Celery worker):
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/celery-3.1.23-py3.5.egg/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/celery-3.1.23-py3.5.egg/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/superset-0.17.1rc2-py3.5.egg/superset/sql_lab.py", line 136, in get_sql_results
    df_data = np.array(data) if data else []
ValueError: setting an array element with a sequence
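
The root cause is that `np.array(data)` cannot build a homogeneous array when some cells contain nested sequences (the `array<struct<...>>` values). A minimal sketch of one possible workaround, assuming a hypothetical `normalize_rows` helper (not Superset code) that JSON-encodes nested cells before the array is built:

```python
import json
import numpy as np

def normalize_rows(rows):
    """JSON-encode nested cells so every value in a row is a flat scalar.

    Hypothetical helper for illustration; not part of Superset.
    """
    return [
        tuple(
            json.dumps(cell) if isinstance(cell, (list, dict, tuple)) else cell
            for cell in row
        )
        for row in rows
    ]

# Rows shaped like what a Hive connector might return for an
# array<struct<key:type, key2:type2>> column.
rows = [
    (1, [{"key": "a", "key2": 1}]),
    (2, [{"key": "b", "key2": 2}, {"key": "c", "key2": 3}]),
]

# np.array(rows) would choke on the nested lists; the normalized
# rows contain only scalars and strings, so this succeeds.
arr = np.array(normalize_rows(rows))
print(arr.shape)  # (2, 2)
```

Serializing to JSON also keeps the nested values displayable in the SQL Lab results grid, rather than dropping them.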
Notice: this issue has been closed because it has been inactive for 404 days. Feel free to comment and request that this issue be reopened.
Please reopen this issue. Array-type columns are still causing problems in Apache Superset.