Spyder: issue opening the data frame

Created on 10 Oct 2020 · 2 comments · Source: spyder-ide/spyder

Description

What steps will reproduce the problem?

import pandas
pandas.read_csv("<any csv file>")  # reading any CSV reproduces the error

Traceback

  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 5272, in __getattr__
    if self._info_axis._can_hold_identifiers_and_holds_name(name):
  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 5272, in __getattr__
    if self._info_axis._can_hold_identifiers_and_holds_name(name):
  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 5272, in __getattr__
    if self._info_axis._can_hold_identifiers_and_holds_name(name):
  [Previous line repeated 992 more times]
  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 493, in _info_axis
    return getattr(self, self._info_axis_name)
  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 5270, in __getattr__
    return object.__getattribute__(self, name)
  File "pandas/_libs/properties.pyx", line 63, in pandas._libs.properties.AxisProperty.__get__
  File "/Users/rchock001c/opt/anaconda3/lib/python3.7/site-packages/pandas/core/generic.py", line 5270, in __getattr__
    return object.__getattribute__(self, name)
RecursionError: maximum recursion depth exceeded while calling a Python object
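
For context, the traceback shows pandas' __getattr__ and its _info_axis property calling back into each other until Python's recursion limit is hit, which is what can happen when a broken install leaves the attribute they look up missing. A minimal sketch of that recursion pattern (illustrative only, not pandas code):

class Example:
    def __getattr__(self, name):
        # __getattr__ runs only when normal lookup fails; looking up another
        # missing attribute here re-enters __getattr__, so it never terminates.
        return getattr(self, "_axis")  # "_axis" is also missing -> recursion

Example().columns  # raises RecursionError: maximum recursion depth exceeded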

Versions

  • Spyder version: 4.1.5
  • Python version: 3.7.0
  • Qt version: 5.9.6
  • PyQt5 version: 5.9.2
  • Operating System: Darwin 19.6.0

Dependencies

# Mandatory:
applaunchservices >=0.1.7      :  0.2.1 (OK)
atomicwrites >=1.2.0           :  1.4.0 (OK)
chardet >=2.0.0                :  3.0.4 (OK)
cloudpickle >=0.5.0            :  1.5.0 (OK)
diff_match_patch >=20181111    :  20200713 (OK)
intervaltree                   :  None (OK)
IPython >=4.0                  :  7.16.1 (OK)
jedi =0.17.1                   :  0.17.1 (OK)
nbconvert >=4.0                :  5.6.1 (OK)
numpydoc >=0.6.0               :  1.1.0 (OK)
parso =0.7.0                   :  0.7.0 (OK)
pexpect >=4.4.0                :  4.8.0 (OK)
pickleshare >=0.4              :  0.7.5 (OK)
psutil >=5.3                   :  5.7.0 (OK)
pygments >=2.0                 :  2.6.1 (OK)
pylint >=1.0                   :  2.4.4 (OK)
pyls >=0.34.0;<1.0.0           :  0.34.1 (OK)
qdarkstyle >=2.8               :  2.8.1 (OK)
qtawesome >=0.5.7              :  0.7.2 (OK)
qtconsole >=4.6.0              :  4.7.5 (OK)
qtpy >=1.5.0                   :  1.9.0 (OK)
rtree >=0.8.3                  :  0.9.4 (OK)
sphinx >=0.6.6                 :  3.1.2 (OK)
spyder_kernels >=1.9.4;<1.10.0 :  1.9.4 (OK)
watchdog                       :  None (OK)
zmq >=17                       :  19.0.1 (OK)

# Optional:
cython >=0.21                  :  0.29.21 (OK)
matplotlib >=2.0.0             :  3.2.2 (OK)
numpy >=1.7                    :  1.18.5 (OK)
pandas >=0.13.1                :  1.0.5 (OK)
scipy >=0.17.0                 :  1.5.0 (OK)
sympy >=0.7.3                  :  1.6.1 (OK)

Awaiting Followup

All 2 comments

Hey @RamG79, thanks for reporting. This looks like an installation issue with Pandas.

To fix it, please open Terminal.app and run:

conda install -f pandas

If that doesn't fix the issue, please run:

conda install pandas=1.1

Finally, if the issue persists, I'm afraid you'll have to reinstall Anaconda. For that, please see our video about it.
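
Once pandas is reinstalled, a quick way to confirm the problem is gone is to restart the Spyder console and run a minimal check (a sketch only; example.csv is a placeholder for any CSV file you have):

import pandas as pd

print(pd.__version__)            # should report the freshly installed version
df = pd.read_csv("example.csv")  # placeholder path; any readable CSV works
print(df.head())                 # finishing without a RecursionError means the install is healthy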

@ccordoba12 Thank you. "conda install -f pandas" fixed the problem. I actually started seeing this after upgrading Spyder from 4.1.4 to 4.1.5, so it looks like some links got messed up during that process. It is working fine now. I appreciate your time and advice. Thanks again.
