I am seeing failures for a really simple test when running on Travis - pytest seems to be killed during the test collection phase. I have isolated it down to running a single test file containing only one test:
import pytest
import sys
import lz4.block

@pytest.mark.skipif(sys.maxsize < 0xffffffff,
                    reason='Py_ssize_t too small for this test')
def test_huge():
    try:
        huge = b'\0' * 0x100000000  # warning: this allocates 4GB of memory!
    except MemoryError:
        pytest.skip('Insufficient system memory for this test')
On Travis I see this:
$ pytest tests/block/test_block_2.py
============================= test session starts ==============================
platform linux -- Python 3.6.3, pytest-3.3.0, py-1.5.2, pluggy-0.6.0
rootdir: /home/travis/build/python-lz4/python-lz4, inifile: setup.cfg
collecting 0 items /home/travis/.travis/job_stages: line 57: 25059 Killed pytest tests/block/test_block_2.py
Unfortunately I can't reproduce this locally, so I am at a bit of a loss as to what's happening. Happy to provide more info if needed.
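A way to double-check that collection alone is enough to trigger the kill (using pytest's standard --collect-only flag, so no test bodies are ever executed) would be:
$ pytest --collect-only tests/block/test_block_2.py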
GitMate.io thinks possibly related issues are https://github.com/pytest-dev/pytest/issues/1123 (does pytest support to this way to collect test cases?), https://github.com/pytest-dev/pytest/issues/1978 (pytest not finding any tests under module), https://github.com/pytest-dev/pytest/issues/1563 (All pytest tests skipped), https://github.com/pytest-dev/pytest/issues/2654 (Terminal writing with a single test produces "collected 1 item s"), and https://github.com/pytest-dev/pytest/issues/239 (not collect deselected tests).
@jonathanunderwood at first glance this looks like the process gets directly killed at collection
this might be a memory constraint, please check with the @travis-ci people
https://docs.travis-ci.com/user/common-build-problems/#My-build-script-is-killed-without-any-error
3GB is the limit, closing as user error
But why is it being killed at collection stage? The body of the test isn't run at that point, is it? I would understand if this was dying when the tests are actually being run, but this is a failure during collection, and this is the only test to be collected, so it's unclear to me why there would be a lot of memory consumption, unless I am hitting some kind of bug in pytest.
reopening since i missed the point
PYTHONUNBUFFERED=x PYTEST_DEBUG=1 pytest tests/block/test_block_2.py should get us more information
further testing revealed python on travis crashing on the file as well
OK, so a quick summary in case anyone else hits this problem.
Running the file directly with python (python tests/test_block_2.py) also resulted in the python process being killed (exit code 137); that removed pytest from the picture. Confirmed by reducing to a simple file like this, with the pytest stuff commented out:
def test():
    try:
        huge = b'\0' * 0x100000000  # warning: this allocates 4GB of memory!
    except MemoryError:
        print('OOM')
and locally running:
valgrind --tool=massif python tests/block/test_block_2.py
ms_print massif.out.7282 | less
Shows that during byte compilation 4GB of memory is used, even though the function is never actually called: the compiler's constant folding evaluates the constant expression b'\0' * 0x100000000 at compile time (the optimizer computes the folded value first and only then discards it as too large to keep as a constant), so the allocation happens before any test code runs.
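The folding is easy to see on a smaller constant expression - a minimal illustration using the standard dis module (the exact bytecode output varies between Python versions):

import dis

# Compile a tiny constant expression; the optimizer pre-computes the
# multiply, so the resulting bytecode contains a single folded constant.
dis.dis(compile("b'ab' * 10", '<example>', 'eval'))
# The disassembly shows LOAD_CONST b'abababababababababab' - the
# multiplication already happened during compilation, not at run time.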
So, I'll close this, as this is a Python problem, not a pytest bug.
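For anyone else who hits this, one possible workaround (a sketch of my own, assuming the constant folding above is the culprit) is to bind one operand to a name, so the expression is no longer a constant and there is nothing to fold:

import pytest
import sys

@pytest.mark.skipif(sys.maxsize < 0xffffffff,
                    reason='Py_ssize_t too small for this test')
def test_huge():
    size = 0x100000000  # bound to a name, so b'\0' * size is not a
                        # constant expression and cannot be folded
    try:
        # the 4GB allocation now happens at run time, inside the try
        # block, where MemoryError can actually be caught
        huge = b'\0' * size
    except MemoryError:
        pytest.skip('Insufficient system memory for this test')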
@jonathanunderwood thanks for the followup as well as calling me out on my mistake earlier :+1:
No worries at all, thank you for patiently working with me on IRC this afternoon to get to the bottom of it. I learnt a lot more about the python byte compilation process than I thought I'd ever need to know!