I've got a large number of dynamically generated tests, and I'd like to be able to trigger an xpass inside the test, just like I can call pytest.fail() and pytest.xfail().
But pytest.xpass() doesn't seem to exist.
@MichaelClerx do you have an example of what your test generation looks like?
It may be possible to apply the "pytest.mark.xfail" mark to the generated test. It seems odd to me that you would only find out during the execution of the test that you expected it to fail, but then it didn't.
The test checks if a parser handles a list of input files correctly, but there are a few hundred files and some of them are known to fail. The test looks roughly like this:
@pytest.mark.parametrize('path', get_valid_files())
def test_valid_file(path):
    valid = parse(path)
    ...
    if path in known_fails:
        if not valid:
            pytest.xfail()
        else:
            # Expected fail, but passed
            pytest.xpass()  # doesn't exist, but this is what I'd want
The function of the xpass here is to notify the user that something has changed in the parsing code: where we used to have a known failure, it now passes unexpectedly.
you can use https://docs.pytest.org/en/latest/reference.html#pytest-param on your parametrized parameters with pytest.mark.xfail
I'm assuming "known_fails" is known at collect time:
@pytest.mark.parametrize(
    'path',
    [
        pytest.param(
            path, marks=pytest.mark.xfail if path in known_fails else ()
        )
        for path in get_valid_files()
    ]
)
def test_valid_file(path):
    ...
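If you also want the report to say why a case is expected to fail, the mark itself can carry a reason (a small variation on the snippet above; the reason text here is made up):

import pytest

@pytest.mark.parametrize(
    'path',
    [
        pytest.param(
            # attach an explanatory xfail mark only to the known failures
            path,
            marks=pytest.mark.xfail(reason='known parser bug')
            if path in known_fails
            else (),
        )
        for path in get_valid_files()
    ]
)
def test_valid_file(path):
    ...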
Thanks!
Just out of curiosity: why do I get to call pytest.xfail() and pytest.fail() programmatically? Will that be deprecated in the future in favour of marking?
Actually hold on. I can't mark it as an xpass, because I'm expecting it to fail. But then it passes unexpectedly, and catching that is what I'm trying to do :-)
Sorry it's a weird one:
* I've got a file that _should_ pass validation
* I've got a validator that I expect will fail the file
* If the file passes, that's odd and I'd like to know about it: that's when I'd like to trigger the unexpected pass
an expected failure should be marked as xfail; the xpass will then happen automatically
fail/xfail work by raising exceptions; it's fundamentally impossible to make a symmetric equivalent for pass/xpass
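A quick way to see this (a sketch; XFailed is an internal pytest class, not public API):

import pytest

def test_xfail_is_an_exception():
    # pytest.xfail() never returns: it raises an exception that the
    # runner intercepts, which is how it can abort a test mid-body.
    # A symmetric pytest.xpass() can't work that way, because "passed"
    # is only known once the test body runs to completion.
    with pytest.raises(BaseException) as excinfo:
        pytest.xfail('demo')
    assert type(excinfo.value).__name__ == 'XFailed'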
Just out of curiosity: why do I get to call pytest.xfail() and pytest.fail() programmatically? Will that be deprecated in the future in favour of marking?
so pytest.xfail is already "Noted" as being odd; it's not deprecated (@RonnyPfannschmidt, maybe it should be?):
https://docs.pytest.org/en/latest/reference.html#pytest-xfail
Note
It is better to use the pytest.mark.xfail marker when possible to declare a test to be xfailed under certain conditions like known bugs or missing features.
I personally do not use pytest.fail(), and just use "assert sut(input) == expected", but it can be useful in cases that need a callback function to throw an exception, e.g. when they are in a context that does not permit statements.
I've seen some people do this instead, so they can raise in lambdas or ternaries:

def raise_(E, *args, **kwargs):
    raise E(*args, **kwargs)
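For example (hypothetical usage, building on the helper above):

# usable anywhere an expression is required, e.g. a lambda...
on_error = lambda msg: raise_(ValueError, msg)

# ...or a conditional expression:
x = 4
half = x // 2 if x % 2 == 0 else raise_(ValueError, 'x must be even')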
@MichaelClerx
Sorry it's a weird one:
* I've got a file that _should_ pass validation
* I've got a validator that I expect will fail the file
* If the file passes, that's odd and I'd like to know about it: that's when I'd like to trigger the unexpected pass
Does this not already fulfil all those 3 use-cases? https://github.com/pytest-dev/pytest/issues/6288#issuecomment-559758327
If not, you may want to set the xfail_strict config option:
http://doc.pytest.org/en/latest/skipping.html#strict-parameter
[pytest]
xfail_strict=true
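The same behaviour is also available per test through the mark's strict parameter (the test itself is a made-up illustration):

import pytest

# With strict=True (or xfail_strict = true in the ini), an xfail-marked
# test that unexpectedly passes is reported as a failure, not just XPASS.
@pytest.mark.xfail(reason='parser used to reject this file', strict=True)
def test_known_fail():
    assert True  # unexpected pass -> reported as a failure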
If so, does that fix it for you?
Just out of curiosity: why do I get to call pytest.xfail() and pytest.fail() programmatically?
pytest.fail(), pytest.skip(), and pytest.xfail() are there for situations where you don't want to check for a condition at module level, which is when conditions for marks are evaluated.
Consider:
import pytest
import slow_module_to_import

@pytest.mark.xfail(slow_module_to_import.very_slow_function(), reason="...")
def test_foo():
    ...
Vs:
import pytest

def test_foo():
    import slow_module_to_import

    if slow_module_to_import.very_slow_function():
        pytest.xfail(reason="...")
Performance is not the only reason, either: sometimes your condition needs to access things which are not available at module import time (a database connection, a config file, etc.).
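For instance, the condition might depend on a fixture, which only exists while the test runs (a sketch; db is a hypothetical fixture with made-up methods):

import pytest

def test_feature_x(db):  # 'db' is a hypothetical fixture
    # the condition can only be evaluated here, once the fixture
    # exists; a module-level mark could not express this
    if not db.supports('feature-x'):
        pytest.xfail(reason='backend lacks feature-x')
    assert db.run_feature_x() == 'expected'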
Will that be deprecated in the future in favour of marking?
No plans at all.