_Applicable FAQ entries checked; issue #2713 is closed._
Mocha version verified with `node node_modules/.bin/mocha --version` (local) and `mocha --version` (global); Mocha is _not_ installed globally, as recommended.

**Mocha exits with exit code 0 despite having failing tests**
The problem is visually confirmable by truncated output: Mocha does not print all test lines before Yarn prints `Done in Xs.`, and the last test line printed varies from run to run.
I am still working to reproduce this outside of our test suite. Running sub-suites alone seems to work fine, but running the entire suite does not. Possibly a timing issue, or one related to high test counts?
Expected behavior:
Mocha reports a non-zero (failure) exit code and prints complete console output.
Actual behavior:
Mocha reports a 0 exit code and does not complete console output.
Test Output
$ yarn mocha './mochaWhy/*.test.js'
yarn run v1.9.4
$ C:\projectDir\node_modules\.bin\mocha ./mochaWhy/*.test.js
ListBuilder
1) fails a test before compliance check
2) creates the links
3) fails a test after compliance check
dataBuilder - Empty
✓ It does not create uneccessary data
Compliance
✓ Files with bare bones data are compliant (1196ms)
✓ Files with simple data are compliant (46ms)
✓ Files with bare bones data are compliant (152ms)
✓ Files with simple data are compliant (95ms)
orders
retrieveFile
4) handles arrays
✓ handles an empty array of uuids
5) throws validation error
retrieveById
6) Finds info, and pulls the associated info
7) Handles b
✓ Handles a
save
✓ handles empty arrays gracefully
8) calls entity validation
9) doesn't call db-update upon validation failure
Dispatch Sequence
10) Queues b
11) Handles z
12) Doesn't allow x
13) Doesn't allow y
14) Doesn't create g
15) Throws an error if w
16) Throws an error if t
17) Allows partial success when p
structureData
It makes sure 8
✓ Handles no
✓ Zero-Index input IDs
✓ One-Index input IDs
✓ Null Ids
✓ Skipped Id
✓ Mixed Values
Interactor
parseAndSaveJson
- handles a
- handles b
- handles c
- handles d
- not archived if origin is API
- Can determine set
dataInteractor
unzipConvert  // not the last test in the suite, but it is the last printed
Done in 4.45s.  // [this should be the failing count & error details]
$ echo $?
0
Reproduces how often:
Currently, every time I run the full suite. This has been a recurring problem for us; upgrading Node has sometimes helped, but no solution has been found this time, so the upgrades were most likely a band-aid for the problem.
Mocha: 3.5.3 and 6.1.4
Node versions: 8.10, 8.11, 8.12, 8.16, 10.0, 10.16 (in the past, upgrading Node 8.9 to 8.10 seemed to fix it, but that is no longer a solution)
Gitlab runner: docker:latest node:8
Ubuntu 16.04 64-bit
Windows 10 64-bit (Git Bash MINGW64 & Windows PowerShell)
https://github.com/mochajs/mocha/issues/2713
https://github.com/mochajs/mocha/issues/188
https://github.com/mochajs/mocha/issues/187
https://github.com/mochajs/mocha/issues/2438#issuecomment-247223269 (in the discussion, not the main post)
I've been going through test combinations trying to pin things down. I currently have it narrowed to a set of tests that returns a proper exit code, or doesn't, on a whim: same set of tests, same environment, one run I'll get 0 (incorrect), the next I'll get the error-count exit code (expected).
Appreciate your efforts thus far, but without an MCVE there's not much we can do.
You also seem to have quite the stack there, which needs to get pared down to _just_ Mocha; we ~can't~ won't debug your Windows MINGW install running Docker with various Ubuntu-based Node images.
I understand, and if I can pinpoint a reliable MCVE I'll post one, it's just taking a while to whittle things out. As a clarification, the environments listed were multiple envs and tools checked to eliminate that as the root cause.
Please try to run your tests - instead of `.bin\mocha` - with the binaries `node bin\mocha` and `node bin\_mocha`. If there is a different output, it could be a problem with the child process.
@nwesterman, do your failures repeat on the same test case(s)? Are the failing cases async-related? Are you stubbing `process.stdout.write` or `console.log` anywhere?
I was able to get a subset of tests arranged that I'm free to upload, and it reproduces the issue. This specific arrangement has been giving me an exit code of 3 while printing error lines for a dozen failed tests.
If it does not reproduce on other systems, I likely trimmed too much, since the sheer number of tests & the time to run them seems to affect reproducibility (this has been true for a while, and is likely why upgrading Node helped in the past, as the runtime improved).
Test files are numbered to mess with the run order. https://github.com/nwesterman/mochaExitCode_3893
> do your failures repeat on the same test case(s)? Are the failing cases async-related? Are you stubbing `process.stdout.write` or `console.log` anywhere?
@plroebuck The assertions pass/fail as I would expect. Some of the assertions in our project are in async tests, but none of the ones in the posted subset are. We do not stub `process.stdout.write` or `console.log`.
> Please try to run your tests - instead of `.bin\mocha` - with the binaries `node bin\mocha` and `node bin\_mocha`. If there is a different output, it could be a problem with the child process.
@juergba I did not notice any difference besides the test runtime not printing (e.g. `Done in 4.45s`) when using `node_modules/mocha/bin/mocha`. I could not locate a `bin\_mocha`. I do not have Mocha installed globally, if that is what you were referring to.
Our current workaround for this issue is running files separately, as the error occurs only when we run a significant number at the same time. When run alone, each suite works as expected.
Any updates here? Did you review my results?
Created a GitLab CI pipeline; it succeeded like your run. After duplicating a number of the tests, I have it reproducing.
Run output: https://gitlab.com/nicwest/mochaexitcode_3893/-/jobs/208856266
New Repo: https://gitlab.com/nicwest/mochaexitcode_3893
Still WOMM (works on my machine). Same steps as before, results from the GitLab repo. It now reports 19 errors instead of 12.
$ node --version
v10.15.0
Between running Node 10 and your computer's resources, the test set runs faster and completes, which is why I pushed everything up to GitLab to run with their CI for reproducibility. Also pushed your organization changes off a branch and onto master.
Nit: package.json
This is unnecessary with tests in the "test" directory. Simplify
`"test": "mocha './{,!(node_modules)/**/}*.test.js'",`
to
`"test": "mocha",`
Nit: README.md
Simpler as:
Run command: `$ ./node_modules/.bin/mocha`
> Between running Node 10 and your computer's resources, the test set runs faster and completes, which is why I pushed everything up to GitLab to run with their CI for reproducibility. Also pushed your organization changes off a branch and onto master.
So if I understand you, this problem only happens if your ".gitlab-ci.yml" uses `image: node:8`.
Changing to `image: node:10` (or later) gives the same results as I get, right?
Where would you like to go with this now? I'm not going to trawl all the possible changes across major Node versions (and affected packages).
Trawling Node versions doesn't change anything; it's a band-aid over the problem. The previously "resolved" ticket was on Node 7.2, and our group has seen the issue from Node 8 up through Node 12.
Node 10 branch: https://gitlab.com/nicwest/mochaexitcode_3893/tree/node10
Node 10 output: https://gitlab.com/nicwest/mochaexitcode_3893/-/jobs/221718456
Node 12 branch: https://gitlab.com/nicwest/mochaexitcode_3893/tree/node12
Node 12 output: https://gitlab.com/nicwest/mochaexitcode_3893/-/jobs/221761744
The only additional change from the Node 8 branch is repeated tests to extend the run. I'd created master on Node 8 to reduce the code needed to reproduce the issue.

This happens with Cypress as well. To reproduce:
1. `npm i` to install packages.
2. In packages/reporter, write a failing test in the 'header/header.spec.jsx' file, like `expect(false).to.be(true)`. (This step is added because I fixed the bugs and they might not exist when you clone the repo.)
3. `npm test`.

I found this bug because it happened in the Cypress CircleCI. Check cypress-io/cypress#5770
@nwesterman It seems to be a reporter problem.
First I removed the MaxListenersExceededWarnings by setting:
- `process.setMaxListeners(0)` in "lib/cli/run.js"
- `process.stdout.setMaxListeners(0)` in "lib/reporters/base.js"

The warnings disappear, but still the tests don't run completely.
When I use a different reporter like `json` or `min`, then the tests seem to run successfully to the end with exit code 19.