JUnit 5: Introduce Test Kit for testing engines

Created on 2 Apr 2018 · 17 comments · Source: junit-team/junit5

Feature Request

I am looking for a straightforward, stable API for testing extensions that would offer deep insight when meta-testing an extension (think something similar to what is provided via the SummaryGeneratingListener, but with finer-grained information reported).

For context: when testing extensions via meta-testing (having a test that runs and inspects other test runs), I found it super helpful to get insight into the test executions that occur based on how my extension may or may not affect them.

ExecutionEventRecorder seems to have some functionality that would be helpful in getting deep insight into what a custom Extension might be doing, but it looks like it's not published as an API to consumers?

Some digging around led me to find that things like ExecutionReportListener exist in the wild (see this relevant SO question: https://stackoverflow.com/questions/46841243/how-to-test-extension-implementations), but it still has some holes in its implementation. For example, it doesn't propagate information like test run duration, start and end times, or test hierarchy.

I took a stab at rolling some custom things to test docker-compose-rule's new JUnit 5 extension in this branch (full PR here: https://github.com/palantir/docker-compose-rule/pull/223) that include the metadata I was referring to above, but wanted to know if there was a better way of doing it, or if something was published already. If not, I would be requesting this functionality from JUnit 5.

Related Issues

  • #1621
  • #1662
Labels: Jupiter, Platform, Test Kit, Vintage, new feature

Most helpful comment

Really looking forward to this in https://github.com/junit-team/junit5/pull/1392 - hopefully it gets into 5.3!

All 17 comments

ExecutionEventRecorder seems to have some functionality that would be helpful in getting deep insight into what a custom Extension might be doing, but it looks like it's not published as an API to consumers?

That's correct. We use it rather extensively within JUnit's own test suite; however, we have not published it.

We realize that extension authors actually need something along those lines, but we just haven't had the time to polish it, document it, and publish it.

Plus, it might need an overhaul in terms of design in order to serve a wider audience than just the core committers.

If you'd like to make a feature proposal for such an "extension testing framework", feel free!

For future reference, I implemented this in Mockito by launching the test engine programmatically: https://github.com/mockito/mockito/blob/9e6ccadc896936391ec36734e363239b0eb3d2e8/subprojects/junit-jupiter/src/test/java/org/mockitousage/StrictnessTest.java#L121-L145 We are mostly interested in the test status, rather than specific details.

The only downside of this approach is the way I detect test methods: https://github.com/mockito/mockito/blob/9e6ccadc896936391ec36734e363239b0eb3d2e8/subprojects/junit-jupiter/src/test/java/org/mockitousage/StrictnessTest.java#L135-L137 Other than that, this approach seems to be working out for us.
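The "run the engine programmatically and check statuses" approach described in the Mockito links above can be sketched with the public JUnit Platform Launcher API. This is only a rough illustration of the pattern, not the Mockito code itself; SampleTestCase is a placeholder name for whatever test class your extension is applied to.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectClass;
import static org.junit.platform.launcher.core.LauncherDiscoveryRequestBuilder.request;

import org.junit.platform.launcher.Launcher;
import org.junit.platform.launcher.core.LauncherFactory;
import org.junit.platform.launcher.listeners.SummaryGeneratingListener;
import org.junit.platform.launcher.listeners.TestExecutionSummary;

// Sketch of status-only meta-testing via the public Launcher API.
// SampleTestCase is a hypothetical test class exercised by the extension.
class StatusOnlyMetaTest {

    static TestExecutionSummary runTests(Class<?> testClass) {
        Launcher launcher = LauncherFactory.create();
        SummaryGeneratingListener listener = new SummaryGeneratingListener();
        // Discover and execute the given class in-process, recording a summary.
        launcher.execute(request().selectors(selectClass(testClass)).build(), listener);
        return listener.getSummary();
    }

    void verifyAllTestsPass() {
        TestExecutionSummary summary = runTests(SampleTestCase.class);
        // Status-level assertions only: no timings or hierarchy, as noted above.
        if (summary.getTestsFailedCount() != 0) {
            throw new AssertionError("Expected no failures, got " + summary.getTestsFailedCount());
        }
    }
}
```

This relies only on the junit-platform-launcher artifact, so it works against any registered engine, which is exactly the limitation the thread goes on to discuss: it reports outcomes but not the finer-grained event stream.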

@sbrannen , thanks for the insight here. Do you have any guidance for starting the draft of the "extension testing framework"?

I would be more than happy to sketch something out, but wanted to know if there are any specifics that come to mind on your end for the implementation you would like to see (beyond the basic contributing guidelines: should it be published in a different package, etc.)? That question goes for other potential users like @TimvdLippe as well (if you have thoughts).

@dotCipher, the best I can recommend is:

  • review your own needs
  • review what can already be done with the ExecutionEventRecorder

  • e.g., by reviewing the API and seeing how it is actually used within JUnit's test suite

  • review what third parties have done

Then document what you feel are required features and nice-to-have features.

should it be published to a different package ... ?

Yes, it would most definitely be published in a different package in its own artifact.

I could imagine the artifact being named junit-jupiter-extension-test or similar.

I'm tempted to call it junit-jupiter-test, but I strongly fear that average developers would think it's needed just to write tests and would therefore confuse it with junit-jupiter-api.

I guess the real challenge here is drawing the line between testing engines vs. testing extensions solely for the Jupiter extension model.

The ExecutionEventRecorder is for testing any engine.

So we should probably focus on that level first... and then see if it makes sense to introduce dedicated testing support for Jupiter extensions.

Of course, if we focus on testing support for engines, that naturally would not be in an artifact named junit-jupiter-*.

Feedback and ideas are welcome!

@sbrannen, thanks for the advice!

I took a look at the ExecutionEventRecorder and there are some things missing that I would also like to see (e.g., timings between tests / containers).

Thanks again for the link to your work on Spring. I will start drafting a bit, see if I can consolidate this functionality at the engine level, and ping back on this thread when I have a PR ready.

For timings, I think it would suffice to add a timestamp property to ExecutionEvent.
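The suggestion above can be illustrated with plain java.time: if each event carried an Instant timestamp, a test's duration would fall out of the difference between its paired started and finished events. The names below (RecordedEvent, Type) are purely illustrative, not JUnit API.

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch of an execution event carrying a timestamp,
// as proposed above. Not actual JUnit Platform API.
class RecordedEvent {
    enum Type { STARTED, FINISHED }

    final Type type;
    final String testId;
    final Instant timestamp;

    RecordedEvent(Type type, String testId, Instant timestamp) {
        this.type = type;
        this.testId = testId;
        this.timestamp = timestamp;
    }

    // A test's duration is simply the gap between its paired events.
    static Duration durationBetween(RecordedEvent started, RecordedEvent finished) {
        return Duration.between(started.timestamp, finished.timestamp);
    }
}
```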

If it's relevant for further consideration, my SO question (and some answers) about how to test that an extension throws a specific exception: https://stackoverflow.com/q/47237611/4365460

Thanks @rweisleder, that's a useful reference. I actually think I have a draft targeting the engine level (i.e., engine-agnostic). I'll push up the initial draft shortly, and it might make sense to update the labels / title of this issue to reflect that?

As an aside, I had slight difficulty using the third-party Eclipse formatter as an IntelliJ user, and apologize in advance if I missed any styling / formatting issues in my upcoming PR (more details on my problem in #1374).

I'll push up the initial draft shortly

Looking forward to it!

and it might make sense to update the labels / title of this issue to reflect that?

Let's hold off on that for the time being.

As an aside, I had slight difficulty using the third-party Eclipse formatter as an IntelliJ user, and apologize in advance if I missed any styling / formatting issues in my upcoming PR

Well... if you missed anything, the build for your PR will fail on the CI server. So it's recommended that you execute gradle clean build locally before submitting a PR.

First PR is up as #1380; it mainly includes the refactoring and addition of timings. Ideally I would also like to include a union-type result at a granularity higher than the ExecutionEvent (think TestExecution) that would summarize the execution of a test (failure / success or skip reason). That PR sets up the basic framework for adding such a thing (via the ExecutionGraph result), so it should be a fairly straightforward feature to add.

I would be happy to include that in #1380 or in a separate PR.
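One possible shape for the TestExecution "union type" floated above can be sketched in plain Java: a single value that carries the outcome plus a failure message or skip reason. All names here are hypothetical; the actual design was left open in the PR discussion.

```java
// Illustrative sketch only: a result type summarizing one test's outcome,
// in the spirit of the proposed TestExecution. Not JUnit API.
class TestExecution {
    enum Status { SUCCESSFUL, FAILED, SKIPPED }

    private final Status status;
    private final String reason; // failure message or skip reason; null on success

    private TestExecution(Status status, String reason) {
        this.status = status;
        this.reason = reason;
    }

    static TestExecution successful()            { return new TestExecution(Status.SUCCESSFUL, null); }
    static TestExecution failed(String message)  { return new TestExecution(Status.FAILED, message); }
    static TestExecution skipped(String reason)  { return new TestExecution(Status.SKIPPED, reason); }

    Status status() { return status; }
    String reason() { return reason; }
}
```

The private constructor plus static factories make the three outcomes the only constructible states, which is the practical benefit of a union type here.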

Really looking forward to this in https://github.com/junit-team/junit5/pull/1392 - hopefully it gets into 5.3!

FYI: if you're interested in this issue, you may also be interested in checking out my experimental branch for a "fluent API" for the new _Test Kit_: https://github.com/junit-team/junit5/commits/experiments/test-kit-fluent-api

Beware: it's a work in progress.
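For readers arriving later: the fluent API that eventually shipped in JUnit 5.4's junit-platform-testkit artifact looks roughly like the sketch below (the experimental branch linked above may differ in detail). MyExtensionTestCase is a placeholder for a test class exercised by the extension under test.

```java
import static org.junit.platform.engine.discovery.DiscoverySelectors.selectClass;

import org.junit.platform.testkit.engine.EngineTestKit;

// Sketch of the Test Kit fluent API as it shipped in junit-platform-testkit.
// MyExtensionTestCase is a hypothetical test class; the expected statistics
// are illustrative.
class ExtensionMetaTest {

    void verifyExtensionBehaviour() {
        EngineTestKit.engine("junit-jupiter")               // run the Jupiter engine in-process
            .selectors(selectClass(MyExtensionTestCase.class))
            .execute()
            .testEvents()                                    // events for tests, not containers
            .assertStatistics(stats -> stats.started(2).succeeded(1).failed(1));
    }
}
```

Compared with the Launcher/SummaryGeneratingListener approach earlier in the thread, this exposes the full event stream (including timestamps and the container/test hierarchy), which is exactly the gap the original request identified.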

FYI: I have renamed this issue to focus on the current scope for JUnit 5.4.

If we later decide to introduce explicit support for testing extensions in JUnit Jupiter, we can introduce a new GitHub issue to address that.

Closing this issue since the _Test Kit_ has already been introduced in master.

  • Conversion to a fluent API will be addressed in #1662.
  • Documentation will be addressed in #1621.