[x] I have checked that this issue has not already been reported.
[x] I have confirmed this bug exists on the latest version of pandas.
[x] (optional) I have confirmed this bug exists on the master branch of pandas.
Note: Please read this guide detailing how to provide the necessary information for us to reproduce your bug.
import pandas as pd
import sqlite3
# Create empty test table in memory
conn = sqlite3.connect(':memory:')
conn.cursor().execute('CREATE TABLE test (column_1 INTEGER);')
# Run the query without chunksize, works as expected
pd.read_sql('select * from test', conn)
# Run the query with chunksize, returns generator as expected
pd.read_sql('select * from test', conn, chunksize=5)
# However, the generator is empty
list(pd.read_sql('select * from test', conn, chunksize=5))
# I would expect that, in all cases where chunksize isn't necessary,
# the following two lines would return exactly the same result,
# but the second throws "ValueError: No objects to concatenate"
pd.read_sql('select * from test', conn)
pd.concat(pd.read_sql('select * from test', conn, chunksize=5))
In many cases, returning zero rows is an expected result, and the code should run fine on the returned DataFrame (iterating over it, getting all values in a row, etc.).
The current behaviour instead returns an empty generator, with no information about, for example, the columns of the DataFrame.
The expected output would be a generator yielding a single empty DataFrame with the correct column metadata. I would expect that, for all queries that run fine without chunksize being set, the following equality should hold:
pd.testing.assert_frame_equal(
    pd.read_sql(query, conn),
    pd.concat(pd.read_sql(query, conn, chunksize=5)),
)
pd.show_versions()

commit : 333db4b765f8e88c0c2392943cb7d6c6013dc6e8
python : 3.8.2.final.0
python-bits : 64
OS : Darwin
OS-release : 18.7.0
Version : Darwin Kernel Version 18.7.0: Thu Jan 23 06:52:12 PST 2020; root:xnu-4903.278.25~1/RELEASE_X86_64
machine : x86_64
processor : i386
byteorder : little
LC_ALL : None
LANG : en_GB.UTF-8
LOCALE : en_GB.UTF-8
pandas : 1.1.0.dev0+1685.g333db4b76.dirty
numpy : 1.18.4
pytz : 2020.1
dateutil : 2.8.1
pip : 20.1.1
setuptools : 46.4.0.post20200518
Cython : 0.29.19
pytest : 5.4.2
hypothesis : 5.16.0
sphinx : 3.0.4
blosc : None
feather : None
xlsxwriter : 1.2.8
lxml.etree : 4.5.1
html5lib : 1.0.1
pymysql : None
psycopg2 : None
jinja2 : 2.11.2
IPython : 7.14.0
pandas_datareader: None
bs4 : 4.9.1
bottleneck : 1.3.2
fastparquet : 0.4.0
gcsfs : None
matplotlib : 3.2.1
numexpr : 2.7.1
odfpy : None
openpyxl : 3.0.3
pandas_gbq : None
pyarrow : 0.17.1
pytables : None
pyxlsb : None
s3fs : 0.4.2
scipy : 1.4.1
sqlalchemy : 1.3.17
tables : 3.6.1
tabulate : 0.8.7
xarray : 0.15.1
xlrd : 1.2.0
xlwt : 1.3.0
numba : 0.48.0
I have raised a PR which addresses this bug.
Hi Johan,
Is this fixed ?
I am using the following code to read a large table into a DataFrame, but I am getting "AssertionError: 20 columns passed, passed data had 0 columns" after a few iterations.
Here is the code:
with pymssql.connect(server="server",
                     user="u_id",
                     password="pwd",
                     port="port",
                     database="db") as conn:
    generator = pd.read_sql("SELECT * FROM table", conn, chunksize=50000)
    # Append the data to the DataFrame
    for a in generator:
        df = pd.DataFrame()
        df = df.append(a)
Please let me know if there is a workaround or any other way of reading a large SQL table into a DataFrame without facing memory issues.
Thanks and appreciate your suggestions.
The reported bug only happens if the query result has zero rows, in which case the generator is empty (and in your code example, since the generator is empty, the loop never executes). For that case the workaround is:
generator = pd.read_sql(query, conn, chunksize=50000)
try:
    df = pd.concat(generator)
except ValueError:
    # We know the query has zero rows, so it's safe not to pass a chunksize
    # (unless the table has been populated since the first execution of the query)
    df = pd.read_sql(query, conn)
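Separately, note that the loop in your snippet recreates `df` on every iteration, so only the last chunk survives; the usual memory-friendly pattern is to collect the chunks and concatenate once. A sketch, substituting a small in-memory sqlite table for your MSSQL connection:

```python
import sqlite3

import pandas as pd

# Stand-in for the real MSSQL connection and table in the question.
conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE big (a INTEGER)')
cur.executemany('INSERT INTO big VALUES (?)', [(i,) for i in range(10)])

# Collect the chunks and concatenate once, instead of resetting the
# DataFrame inside the loop.
chunks = list(pd.read_sql('SELECT * FROM big', conn, chunksize=4))
df = pd.concat(chunks, ignore_index=True)
print(len(df))  # 10
```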
My PR for a fix has been accepted, but I don't know when it will be merged I'm afraid.
Could you clarify which line fails?
> Could you clarify which line fails?

The last line, inside the except block, fails.
Make sure to set the query variable first, i.e.
query = "SELECT * FROM table"
Thanks Johan :)