Hi,
I am writing a reporting plugin for pytest, and I find it important that it mentions the assertions in the test, basically saying that each one passed and the values involved in the comparison.
Because assert is a statement, we cannot override it or monkeypatch it to run custom reporting code. I also don't want to resort to unittest-style assertion functions, as the whole point of pytest is to be as boilerplate-free as possible.
So I was wondering if pytest has, or could have, some sort of hook that would be called whenever an assertion in a test passes, with the same assertion introspection information as shown when an assertion fails. I suppose this would have to be done around the assertion rewriting.
Can it be done right now with current features and, if not, could you give some directions on where to look? If my abilities are up to it, I would be willing to code it and make a PR if you think it is a worthy feature and everything goes fine.
Thanks!
Checking the assertion rewriting code in pytest, it seems this could be done relatively easily by changing rewrite.py::AssertionRewriter in the visit_Assert method:
# Rewrite assert into a bunch of statements.
top_condition, explanation = self.visit(assert_.test)
# Create failure message.
body = self.on_failure
negation = ast.UnaryOp(ast.Not(), top_condition)
self.statements.append(ast.If(negation, body, []))
It appears we could replace the [] with another set of AST nodes that call a hook, passing forward the explanation generated for the comparison?
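To make the idea concrete, here is a rough sketch in plain Python (not AST code) of what a rewritten assert could behave like if the else branch invoked a pass hook. The hook name and signature here are purely illustrative, not pytest's actual API:

```python
# Illustrative only: what `assert x == y` could expand to if the else
# branch of the rewritten statement called a pass hook.

def pass_hook(explanation):
    # stand-in for the proposed hook call
    print("assertion passed:", explanation)

def run_rewritten_assert(x, y):
    condition = x == y
    explanation = "%r == %r" % (x, y)
    if not condition:
        raise AssertionError(explanation)
    else:
        pass_hook(explanation)

run_rewritten_assert(1, 1)  # prints: assertion passed: 1 == 1
```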
Ok guys another update, I actually have it implemented:
- The statements inside if not (condition) (the ones that raise the exception) are duplicated in the else part (the only difference is that the else branch calls the hook instead of raising AssertionError). Because they are the same, they could be moved before the if-else clause.
- The hook is pytest_assertion_pass(config, explanation), and so far it receives a config object (in the same way as in the pytest_assertrepr_compare hook) and an explanation string (the same one used in the AssertionError message).

I don't really know how to write tests for pytest features themselves, so I am going to look into it and make a PR once I have some tests for this new hook.
Please let me know what you think about this hook. Because we find it important, it would be great for us to have it merged into pytest.
Things open to discussion:
@Sup3rGeo Can you upload your code?
I am really interested in having passing assertions (with their values) in the test report.
@manoj23 As you may already know, it is on my fork for you to try.
So basically these are the updates:
- The rewritten assert now raises AssertionError if it fails, or calls the hook if it passes.
- The hook is now pytest_assertion_pass(config, orig, expl): one string with the original assertion statement, and another with the pytest explanation. For the first one I had to use astunparse to get source code back from the AST, and it would have to become a pytest dependency. The alternative would be to pass the AST tree itself, but I think that is too low level.

Thanks @Sup3rGeo, I tested this branch https://github.com/Sup3rGeo/pytest/commits/features-assertion-pass-hook and the hook works.
Hey @Sup3rGeo, thanks a lot for tackling this!
I think this would be a great addition to the core.
About the tests: take a look at the test suite, especially the tests using the testdir fixture: it allows you to test pytest as a "blackbox" by running tests and checking the output. You could for instance write a simple test file and a conftest.py file which implements your hook. The hook dumps its parameters to a file, which you can then check after the test has finished.
A possible drawback is that we are now always creating the explanation variables, not only when the assertion fails. I don't know if this could be a performance issue.
Usually not, but in some cases it might be a problem (think for example comparing huge strings: this would always produce an expensive diff even when not needed), so we should avoid this if possible.
One solution I can think of is, instead of passing the full explanation to the hook, to pass a lazy callable that the hook can call to obtain the explanation (which would then be computed only if requested). Something along the lines of pytest_assertion_pass(config, orig, expl_getter). This is not very elegant, but it is the only idea I can think of right now.
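A sketch of that lazy-callable idea, with illustrative names:

```python
# Instead of always building the (possibly expensive) explanation, the
# rewritten code passes a zero-argument callable; the hook decides whether
# to pay the cost of computing it.

def make_expl_getter(left, right):
    def expl_getter():
        # the expensive diff would be computed here, only on demand
        return "%r == %r" % (left, right)
    return expl_getter

def pytest_assertion_pass(config, orig, expl_getter):
    # a hook that wants the explanation calls the getter explicitly
    return "passed: " + expl_getter()

result = pytest_assertion_pass(None, "x == y", make_expl_getter(1, 1))
print(result)  # -> passed: 1 == 1
```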
@manoj23 great. Do you need anything other than the original assert statement and the explanation?
@nicoddemus
Thanks for the tips! What is your opinion about having astunparse as an additional dependency?
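For context on what astunparse does here: it turns an AST node back into source text. On Python 3.9+ the stdlib's ast.unparse does the same job (the thread predates it), so it is used below as an equivalent illustration:

```python
import ast

# Recover source text for the `orig` hook argument from the assert's AST.
# astunparse is the third-party package discussed above; ast.unparse
# (stdlib, Python 3.9+) is used here to show the same idea.

assert_node = ast.parse("assert x == y").body[0]
orig = ast.unparse(assert_node.test)
print(orig)  # -> x == y
```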
I am really having a hard time debugging three tests that are not passing, all of them related to garbage collection:
FAIL testing/acceptance_test.py::test_fixture_values_leak
FAIL testing/python/raises.py::TestRaises::()::test_raises_cyclic_reference[function]
FAIL testing/python/raises.py::TestRaises::()::test_raises_cyclic_reference[with]
I tried to make a minimal example with the problem and reverted my feature branch as much as possible to resemble the pytest features branch. I figured out the problem is in assertion/rewrite.py::AssertionRewriter.visit_Assert().
I reduced the method to the following, which still causes the problem; but if I comment out the self.statements.extend(self.on_failure) line, the tests pass.
(... same as feature branch ...)
# Rewrite assert into a bunch of statements.
top_condition, explanation = self.visit(assert_.test)
negation = ast.UnaryOp(ast.Not(), top_condition)
err_name = ast.Name("AssertionError", ast.Load())
exc = ast_Call(err_name, [], [])
if sys.version_info[0] >= 3:
    raise_ = ast.Raise(exc, None)
else:
    raise_ = ast.Raise(exc, None, None)
self.statements.extend(self.on_failure)
self.statements.append(ast.If(negation, [raise_], []))
# Clear temporary variables by setting them to None.
(... same as feature branch ...)
This is the original:
# Rewrite assert into a bunch of statements.
top_condition, explanation = self.visit(assert_.test)
# Create failure message.
body = self.on_failure
negation = ast.UnaryOp(ast.Not(), top_condition)
self.statements.append(ast.If(negation, body, []))
if assert_.msg:
    assertmsg = self.helper('format_assertmsg', assert_.msg)
    explanation = "\n>assert " + explanation
else:
    assertmsg = ast.Str("")
    explanation = "assert " + explanation
template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))
msg = self.pop_format_context(template)
fmt = self.helper("format_explanation", msg)
err_name = ast.Name("AssertionError", ast.Load())
exc = ast_Call(err_name, [fmt], [])
if sys.version_info[0] >= 3:
    raise_ = ast.Raise(exc, None)
else:
    raise_ = ast.Raise(exc, None, None)
body.append(raise_)
# Clear temporary variables by setting them to None.
I have no idea why this is causing garbage collection problems. I could use some help from more experienced people.
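For anyone following along, the pattern in the snippets above can be reproduced standalone with the ast module. This is heavily simplified (the real rewriter also manages temporary variables and explanation formatting), but it shows the shape of what gets emitted:

```python
import ast

# Build `if not (x == y): raise AssertionError("assert x == y")` by hand,
# compile it, and execute it with different globals.

condition = ast.parse("x == y", mode="eval").body
negation = ast.UnaryOp(ast.Not(), condition)
exc = ast.Call(
    ast.Name("AssertionError", ast.Load()),
    [ast.Constant("assert x == y")],
    [],
)
raise_ = ast.Raise(exc, None)
module = ast.Module(body=[ast.If(negation, [raise_], [])], type_ignores=[])
code = compile(ast.fix_missing_locations(module), "<rewritten>", "exec")

exec(code, {"x": 1, "y": 1})      # condition true: passes silently
try:
    exec(code, {"x": 1, "y": 2})  # condition false: raises
except AssertionError as exc_info:
    print(exc_info)               # -> assert x == y
```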
@manoj23 great. Do you need anything other than the original assert statement and the explanation?
Nope, I have everything, thanks. Currently, I am trying to figure out how to insert the assert statement and explanation in the Junit XML report in a specific "passed" node.
Guys, should I create a PR so it is easier to review the code? The same three tests related to garbage collection are still failing, but it would be easier to discuss this in a PR.
@Sup3rGeo
Nope, I have everything, thanks. Currently, I am trying to figure out how to insert the assert statement and
explanation in the Junit XML report in a specific "passed" node
Actually, I want to write a report of passed asserts that includes, in addition to the assert (orig) and the explanation:
Thus, I could print something like the failure node:
<testcase classname="test_simple_assert" file="test_simple_assert.py" line="4" name="test1">
<...>
test_simple_assert.py:6: AssertionError</failure>
</testcase>
Do you think I can have access to this info?
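One possible sketch of producing such a node with the stdlib. Note the passed-assertion element name is made up for this illustration; JUnit XML has no standard node for passing assertions, so real consumers of the report would not know about it:

```python
import xml.etree.ElementTree as ET

# Build a JUnit-style <testcase> with a custom child element recording a
# passed assertion (element name invented for illustration).

testcase = ET.Element("testcase", {
    "classname": "test_simple_assert",
    "file": "test_simple_assert.py",
    "line": "4",
    "name": "test1",
})
node = ET.SubElement(testcase, "passed-assertion", {"line": "6"})
node.text = "1==1 -> 1 == 1"
print(ET.tostring(testcase, encoding="unicode"))
```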
@manoj23 I updated the hook to receive an item object (instead of config) and also added lineno as an argument. Check if this helps.
Guys, should I create a PR so it is easier to review the code? The same three tests related to garbage collection are still failing, but it would be easier to discuss this in a PR.
Sure, please don't hesitate to open a WIP PR so we can discuss the code. 😁
Closed by #3479
Hi, I'm just giving my support for this hook. This is really useful for official reporting purposes. Please keep it.
Thanks @kyochujoho, noted. 👍
So far we have not had any reports of problems related to the hook, so its experimental status will probably be changed to official sometime in the future.
The hook doesn't seem to work if we have a log statement along with the assert:
def test_one():
    zlogger.validation('-------------')
    assert 1==1, 'Test Passed as 1==1'

def pytest_assertion_pass(item, lineno, orig, expl):
    print("asserting that {0}, {1}, {2}, {3}".format(item, lineno, orig, expl))
The above code doesn't print anything, but if I remove 'Test Passed as 1==1' then it works.
def test_one():
    zlogger.validation('-------------')
    assert 1==1

def pytest_assertion_pass(item, lineno, orig, expl):
    print("asserting that {0}, {1}, {2}, {3}".format(item, lineno, orig, expl))
-[pytest_assertion_pass]-asserting that <Function test_one>, 15, 1==1, 1 == 1
That's by design, @chintanvadgama: pytest only rewrites asserts that don't include a message (assert [expr]). If an assert does include a message (assert [expr], [msg]), then the assertion rewriter intentionally doesn't try to rewrite it.
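The distinction can be seen directly in the AST: an ast.Assert node stores the optional message in its msg attribute, which is what the rewriter inspects. A small check, using the two asserts from the example above:

```python
import ast

# An ast.Assert node keeps the optional message in .msg; a bare assert
# has msg=None, which is the form that triggers the pass hook.

bare, with_msg = ast.parse(
    "assert 1==1\n"
    "assert 1==1, 'Test Passed as 1==1'"
).body
print(bare.msg)            # -> None
print(with_msg.msg.value)  # -> Test Passed as 1==1
```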