Pytest: How to issue a clean session-wide warning at the end of pytest run?

Created on 7 Mar 2019 · 7 comments · Source: pytest-dev/pytest

I'm trying to run certain checks on the whole test suite and print some warnings at the end of the pytest run. I'm trying to call warn() from a session-scoped fixture, and the result looks very messy.

Here is an example to reproduce:

cd /tmp
mkdir tests
echo -e "import pytest\nfrom warnings import warn\[email protected](scope='session', autouse=True)\ndef run_check(request): yield; warn('\\\\n\\\\n*** This is global warning ***\\\\n')" > tests/conftest.py
echo -e "def test_a(): assert True\ndef test_b(): assert True\ndef test_c(): assert True" > tests/test_1.py
echo -e "def test_a(): assert True\ndef test_b(): assert True\ndef test_c(): assert True" > tests/test_2.py
pytest tests

The key being:

# cat tests/conftest.py
import pytest
from warnings import warn
@pytest.fixture(scope='session', autouse=True)
def run_check(request): yield; warn('\n\n*** This is global warning ***\n')

This gives after all tests have finished:

collected 6 items

tests/test_1.py ...                                                                                                                                                  [ 50%]
tests/test_2.py ...                                                                                                                                                  [100%]
============================================================================= warnings summary =============================================================================
tests/test_2.py::test_c
  /tmp/tests/conftest.py:4: UserWarning:

  *** This is global warning ***

    def run_check(request): yield; warn('\n\n*** This is global warning ***\n')

-- Docs: https://docs.pytest.org/en/latest/warnings.html

Results (0.11s):
       6 passed

There are several things wrong with this output that make it very difficult to see and understand the actual warning:

  • It attributes the warning to the last test, which has nothing to do with it.
  • It prints the code that generates the warning.
  • It prints a link to the docs, which is irrelevant here.

How can I issue a global warning for pytest to print at the end of its run? print doesn't work because pytest "eats" its output.

My ideal result would be:

collected 6 items

tests/test_1.py ...                                                                                                                                                  [ 50%]
tests/test_2.py ...                                                                                                                                                  [100%]
============================================================================= warnings summary =============================================================================

  *** This is global warning ***

Results (0.11s):
       6 passed

but anything better than what I get now is welcome.

Perhaps there is a way that I couldn't find in the docs?

Thank you.

question

All 7 comments

Hi @stas00,

What do you need in order to do your checks? Depending on what checks you are doing, you may use pytest_terminal_summary.

Awesome! It would have been nice to have examples of how to use those functions, but I found this SO answer with an example: https://stackoverflow.com/a/49450167/9201239, so I was able to accomplish what I needed with this:

import pytest
@pytest.hookimpl(hookwrapper=True)
def pytest_terminal_summary(terminalreporter): 
    yield
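    # code after the yield runs once pytest's other terminal-summary hooks (including the built-in warnings summary) have finished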
    print('\n\n*** This is global warning ***\n')

Thanks a lot, @nicoddemus!

If you don't mind that your warning may appear before other warnings, you can avoid making a hook wrapper. I also suggest using terminalreporter.write:

def pytest_terminal_summary(terminalreporter): 
    terminalreporter.section('global warnings')
    terminalreporter.write('some warning 1')
    terminalreporter.write('some warning 2')
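
For reference, the two suggestions can be combined. Here is a minimal sketch (not from the original thread) that keeps the hook wrapper, so the custom section is printed only after pytest has written its own summary sections, while using the reporter's existing section and write_line helpers for formatting:

import pytest

@pytest.hookimpl(hookwrapper=True)
def pytest_terminal_summary(terminalreporter):
    # let pytest write its own summary sections (warnings, durations, ...) first
    yield
    terminalreporter.section('global warnings')
    terminalreporter.write_line('*** This is global warning ***')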

I appreciate you sharing potentially better ways of doing it, @nicoddemus.

I have yet to understand the nuances of pytest hook wrappers; is there any reason I should avoid using one? So far I have been blindly copying examples and adapting them to our needs.

I do like having it written after all other warnings, if there are any. It is basically a summary of recommendations that the finalizer gives to the test writers when it detects that something is amiss in their tests, so appearing last makes it most likely to be noticed when other warnings are present.

If you're curious, we implemented a test registry where each test declares which APIs it exercises, and we use that data in the documentation, linking each API entry back to the tests that exercise it - so it serves both as extra usage examples and as an indication of where a user could contribute a test if one is missing.

So at the end of the pytest run, we report any tests that forgot to declare which API they test.
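
As an illustration only (the "api" marker name and the reporting format below are hypothetical, not the actual fast.ai implementation), such a check could be wired up in conftest.py with a custom marker and pytest_terminal_summary:

_undeclared = []

def pytest_configure(config):
    # register the (hypothetical) custom marker
    config.addinivalue_line(
        "markers", "api(name): declare which API this test exercises")

def pytest_collection_modifyitems(items):
    # remember every collected test that did not declare an API marker
    _undeclared.extend(
        item.nodeid for item in items if item.get_closest_marker("api") is None)

def pytest_terminal_summary(terminalreporter):
    if _undeclared:
        terminalreporter.section("tests missing API declaration")
        for nodeid in _undeclared:
            terminalreporter.write_line(nodeid)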

For example if you click on https://docs.fast.ai/basic_train.html#Learner.save and then click on [test], it'll show:

Tests found for save:

    pytest -sv tests/test_basic_train.py::test_save_load
    [source]
    pytest -sv tests/test_basic_train.py::test_memory
    [source]
    pytest -sv tests/test_vision_train.py::test_model_save_load
    [source]

To run tests please refer to this guide.

The links got stripped in the copy-and-paste, but you will see them if you visit the doc page above.

I have yet to understand the nuances of pytest hook wrappers; is there any reason I should avoid using one? So far I have been blindly copying examples and adapting them to our needs.

None, the plain hook is just slightly simpler. My intention was just to point out a simpler implementation in case it also fit your use case, that's all. If the hook wrapper works better for you, please use it. 👍

So at the end of the pytest run, we report any tests that forgot to declare which API they test.

That's an interesting use case, I'm glad you have been able to use pytest for that. If you encounter any other roadblocks, please do post an issue and we will be glad to help.

Much appreciated, @nicoddemus! Makes me feel welcome!
