When `beforeAll` throws an error, or returns a Promise that rejects, the tests are still run. I would expect the tests not to run at all.
See example repo:
https://github.com/dirkmc/jest-before-all-error-handling
And the output:
```
> jest __tests__/test.js

FAIL __tests__/test.js
  ● test › tests 1 === 1

    My error

      at __tests__/test.js:5:14
      at Object.<anonymous> (__tests__/test.js:4:12)

  ● test › tests 2 === 2

    My error

      at __tests__/test.js:5:14
      at Object.<anonymous> (__tests__/test.js:4:12)

  test
    ✕ tests 1 === 1 (2ms)
    ✕ tests 2 === 2 (1ms)

Test Suites: 1 failed, 1 total
Tests:       2 failed, 2 total
Snapshots:   0 total
Time:        0.154s, estimated 1s
Ran all test suites matching "__tests__/test.js".

  console.log __tests__/test.js:3
    before
  console.log __tests__/test.js:12
    test 1
  console.log __tests__/test.js:17
    test 2
  console.log __tests__/test.js:9
    after

npm ERR! Test failed. See above for more details.
```
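For reference, a minimal reproduction consistent with that output (reconstructed here from the logged strings and the variant quoted later in this thread; the linked repo has the actual file):

```js
// __tests__/test.js: reconstructed; see the linked repo for the real file
describe('test', () => {
  beforeAll(() => {
    console.log('before')
    // Returning a rejected promise (or throwing) should arguably
    // prevent the tests below from running, but they run anyway.
    return new Promise((resolve, reject) => {
      reject(new Error('My error'))
    })
  })

  afterAll(() => console.log('after'))

  it('tests 1 === 1', () => {
    console.log('test 1')
    expect(1).toBe(1)
  })

  it('tests 2 === 2', () => {
    console.log('test 2')
    expect(2).toBe(2)
  })
})
```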
This is how jasmine behaves; I'm afraid this is a wontfix for now.
btw cc @dmitriiabramov who wants this changed.
Is there any clean way to abort a test run other than `process.exit(1)`?
@tamlyn do you mean `--bail`?
No, I mean from within a test suite. I have a condition which means the rest of the file should not run, so I want to skip all remaining tests or terminate the process or something, like the behaviour the OP expected when an exception is thrown in `beforeAll`.

My specific use case: I'm writing some integration tests which hit the DB, but I don't want to run them if there's already data in there.
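One way to approximate that (a sketch, not from this thread; `hasExistingData()` is a hypothetical stand-in for whatever check your DB client provides) is to record the condition once in `beforeAll` and bail out at the top of each test:

```js
// Sketch of a guard flag, assuming a hypothetical hasExistingData() check.
const hasExistingData = async () => {
  // e.g. return (await db.collection('users').countDocuments()) > 0
  return false
}

let shouldSkip = false

beforeAll(async () => {
  // Throwing here would not stop the tests under jest-jasmine2,
  // so set a flag instead and check it in each test.
  shouldSkip = await hasExistingData()
})

it('seeds fixtures', () => {
  if (shouldSkip) {
    console.warn('Skipping: database already contains data')
    return
  }
  // ... real assertions ...
})
```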
@tamlyn oh. in this case i don't think there's a clean way of doing this. you can use `--runInBand` together with `process.exit`.

without `--runInBand`, `process.exit` will only terminate one of the child worker processes and won't exit the main jest process
Yes, exiting just one worker is what I want as they operate independently. It works with `process.exit`, so I'll stick with that. Thanks!
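A minimal sketch of that workaround (`setupDatabase()` is a hypothetical setup step that may reject):

```js
const setupDatabase = async () => {
  // hypothetical: connect, migrate, seed ...
}

beforeAll(async () => {
  try {
    await setupDatabase()
  } catch (err) {
    console.error('Setup failed:', err)
    // Hard-stops this worker. Without --runInBand this kills only
    // the child worker running this file, not the main jest process.
    process.exit(1)
  }
})
```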
Problem with `process.exit` is that I don't get any console error messages before exit.
Ya, I agree. I have to say this is an astonishingly bad developer experience for cases that rely on complex setup procedures. We have several projects with integration tests that need to add fixtures to a real database, and this issue makes it very difficult to differentiate between setup errors and failed tests, especially since setup runs independently for each test file.
It looks like this may be a better place to direct our comments, though.
Is this going to be fixed, since both jasmine and mocha have this feature? And how should I use the `process.exit` workaround for now?
Here is my code:

```js
beforeAll(() => {
  mongoose.connect('mongodb://localhost/test').then(() => {
    mongoose.connection.dropDatabase('test')
  }, err => {
    process.exit
  })
})
```

doesn't seem to work
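Two likely reasons that snippet doesn't work: `process.exit` is referenced but never called (it needs parentheses), and the promise chain isn't returned, so Jest doesn't wait for `mongoose.connect` to settle before starting the tests. A corrected sketch:

```js
beforeAll(() => {
  // Return the promise so Jest waits for setup to finish (or fail).
  return mongoose.connect('mongodb://localhost/test').then(() => {
    // dropDatabase() drops the currently connected database; it takes no name.
    return mongoose.connection.dropDatabase()
  }, err => {
    console.error('Could not connect to MongoDB:', err)
    process.exit(1) // actually call it, with a non-zero exit code
  })
})
```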
`stopSpecOnExpectationFailure` option?
PR very much welcome porting the fix from Jasmine: https://github.com/jasmine/jasmine/commit/585287b9d6e03dc546be7269e1ea569f4bd9ca0d
@danger89 `stopSpecOnExpectationFailure` seems interesting, kinda related to #5823
Wouldn't it be possible to imitate this behavior with `return`?
This is fixed in Jest Circus; we won't be fixing it for Jasmine. If somebody sends a PR, that's fine, but I'm going to close this as we won't get to it.
Sorry, what's the relation with jasmine here? For people who are _not_ using jasmine, failures in `beforeAll` et al. still seem to require a try/catch + `process.exit`.
@rattrayalex-stripe the old test runner for Jest (which is still currently the default) was powered by a jasmine fork called jest-jasmine2. We've written our own runner called jest-circus, which we're migrating everything to
We fixed this issue in the jest-circus runner, and won't be fixing it in the jest-jasmine2 runner (my understanding is that it's a pretty tough one to fix in the jasmine runner)
Ah, gotcha. I guess if I'd been more patient on my request for a jest-circus readme I'd have found that out 😅 Thanks @rickhanlonii!
Yup! Also see https://github.com/facebook/jest/issues/6695#issuecomment-405326998
If you're looking for information about when Jest Circus is going to be released, see #6295. 🎪
> We fixed this issue in the jest-circus runner, and won't be fixing it in the jest-jasmine2 runner (my understanding is that it's a pretty tough one to fix in the jasmine runner)
In what version and PR was it fixed? I've tried installing `jest-circus` from npm and setting `JEST_CIRCUS=1`, but the behavior is the same: an exception in `beforeAll` does not prevent the tests from running.
It turns out that you can expect that behavior if:

- you install the `jest-circus` npm package, and
- you set the `JEST_CIRCUS=1` environment variable when running jest
I'm surprised and bummed that this behavior is not applied by default in Jest.
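If you would rather not rely on the environment variable, pointing Jest's `testRunner` config option at the circus runner should be equivalent; a sketch (the exact mechanism may vary by Jest version):

```js
// jest.config.js: select the circus runner explicitly instead of JEST_CIRCUS=1
module.exports = {
  testRunner: 'jest-circus/runner',
}
```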
`jest-circus` stops tests from being run if a `beforeAll` throws, but it doesn't stop subsequent `beforeAll`s from running! E.g., changing the reproduction code to:
```js
describe('test', () => {
  beforeAll(() => {
    console.log('before');
    return new Promise((resolve, reject) => {
      reject(new Error('My error'))
    })
  })

  beforeAll(() => {
    console.log('before 2');
  })

  afterAll(() => console.log('after'))

  it('tests 1 === 1', () => {
    console.log('test 1')
    expect(1).toBe(1)
  })

  it('tests 2 === 2', () => {
    console.log('test 2')
    expect(2).toBe(2)
  })
})
```
With `jest-circus`, `before 2` prints.
It's pretty surprising that there's a difference between:

```js
beforeAll(() => { x(); y(); });
```

and

```js
beforeAll(() => { x(); });
beforeAll(() => { y(); });
```
I can understand why you would maybe want this behavior, but it would at least be nice if `beforeAll()` functions could check a flag like "are we actually going to run tests, or have we already failed?"
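There is no such built-in flag as far as I know, but it's easy enough to hand-roll one; a sketch, where `connect()` and `seedFixtures()` are hypothetical setup steps:

```js
const connect = async () => { /* hypothetical: may reject */ }
const seedFixtures = () => { /* hypothetical: only safe after connect() */ }

let setupFailed = false

beforeAll(async () => {
  try {
    await connect()
  } catch (err) {
    setupFailed = true
    throw err // still surface the failure to the runner
  }
})

beforeAll(() => {
  if (setupFailed) return // skip dependent setup by hand
  seedFixtures()
})
```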
Seems like transitioning to jest-circus is ongoing: https://github.com/facebook/jest/issues/6295