Jest: Synchronous tests don't time out

Created on 5 Sep 2018 · 18 comments · Source: facebook/jest

🐛 Bug Report

jest.setTimeout(timeout) does not time out for synchronous tests.

To Reproduce

Run:

  it('times out', () => {
    const startTime = new Date().getTime();
    while (new Date().getTime() - startTime < 10000) {
      // wait
    }
  });
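A note on why this is hard to fix: the busy-wait blocks Node's single-threaded event loop, so no timer-based callback (including one that would enforce a timeout) can run until the test body returns. A minimal sketch in plain Node, not part of the original report:

```javascript
// Minimal sketch (plain Node, no Jest required) of why a synchronous test
// cannot be interrupted: while the loop spins, the event loop is blocked,
// so no timer callback can fire until the function returns.
function busyWait(ms) {
  const start = Date.now();
  while (Date.now() - start < ms) {
    // spin; nothing else can execute on this thread
  }
  return Date.now() - start;
}

// A timer scheduled *before* the wait still cannot fire *during* it:
let fired = false;
setTimeout(() => { fired = true; }, 10);
const elapsed = busyWait(100);
console.log(fired);          // → false: the callback is still queued
console.log(elapsed >= 100); // → true: the call only returns after the full wait
```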

Expected behavior

The test times out.

Link to repl or repo

https://repl.it/@gitlab_winnie/jestsetTimeouttimeout-for-synchronous-tests

Run npx envinfo --preset jest

  Binaries:
    Node: 9.0.0 - ~/.nvm/versions/node/v9.0.0/bin/node
    Yarn: 1.7.0 - /usr/local/bin/yarn
    npm: 5.8.0 - ~/.nvm/versions/node/v9.0.0/bin/npm
  npmPackages:
    jest: ^23.5.0 => 23.5.0


All 18 comments

If interrupting the test is technically not possible, it would be good to at least fail the test after it completes.

Current workaround:

let testStartTime;

// `testTimeoutInMs` must be defined in the setup file;
// 5000 matches Jest's default timeout.
const testTimeoutInMs = 5000;

beforeEach(() => {
  testStartTime = new Date().getTime();
});

afterEach(() => {
  if (new Date().getTime() - testStartTime > testTimeoutInMs) {
    throw new Error(`Test took longer than ${testTimeoutInMs}ms!`);
  }
});

in setupTestFrameworkScriptFile
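For completeness, a minimal sketch of wiring such a setup file into the Jest config (the file path here is an assumption):

```javascript
// jest.config.js — minimal sketch; the setup file path is an assumption.
// `setupTestFrameworkScriptFile` is the Jest 23 config key mentioned above;
// Jest 24 replaced it with `setupFilesAfterEach`.
module.exports = {
  setupTestFrameworkScriptFile: '<rootDir>/jest.setup.js',
};
```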

I don't think it's possible to time out short of killing the worker (with force, and it wouldn't work when not using workers).

But your idea of failing a test if it ran longer than it should, regardless of whether we were able to get a timeout error before it completed, makes sense! Wanna send a PR for it?

Thank you for the feedback! 😄

Wanna send a PR for it?

I'll give it a try. 👍

I'm not sure about this. While technically correct, it made a lot of tests start failing randomly for us as flaky, and it took us hours to determine what caused the flakiness when upgrading Jest.

I think we can keep it for now, as it makes sense, but I'm kind of expecting a pushback and issues from the community in relation to tests that started failing after the upgrade to Jest 24.

Yeah, the react native test in this repo turned flaky as well after merging this. I do like it though, and it makes sense to respect the timeout.

We might consider a custom message if a synchronous test times out, though?

"As of Jest 24, Jest will report synchronous tests that take longer than the timeout to complete as timed out." Or something like that.

One other thing we can do is include the actual time taken in the message, instead of "more than 5000ms" or whatever. As we are unable to cancel the synchronous function, it's allowed to complete, so we know how long it takes.

EDIT: And maybe add "breaking change" to the changelog, even though it's technically a bug fix?

Thanks for the response @SimenB !

Including the time the test took to execute in the error message would be really good; that way it would even be possible to build some tooling to automatically increase the timeout of tests that fail due to this (we've just found a huge number of failing tests while trying to upgrade to v24).

Hey @SimenB , @gitlab-winnie!

After spending more time trying to integrate the new version of jest into the FB codebase and discussing it with a few teammates (cc/ @mjesun and @rubennorte), I'd like to suggest adding an option to jest to control whether sync tests that take longer than the timeout should fail. Ideally this option would be disabled by default, at least for jest v24.

Context

I've been trying to upgrade jest in our mobile repository at FB and this feature has been causing a few issues:

  1. A bunch of sync tests have started failing.
  2. The list of failing tests is not constant: a lot of tests are now flaky because the time it takes to run them varies a lot between runs.

After some investigation, we found that point 2 is caused by file transpilation: since jest transpiles files on the fly as they get required, a test case where jest has to transpile all the dependencies takes much longer than one where the transpiled dependencies are already cached.

We've seen cases where transpiling the dependencies takes several seconds but the test only takes a few milliseconds to run, so these tests fail only when running without cache or from a worker that has not transpiled its files yet.

To be fair, these same issues are already present in async tests due to the timeout, but in that scenario it's a tradeoff worth paying to be able to kill tests that would otherwise run forever due to unresolved promises, etc.

In the sync test scenario, failing when a test has taken longer than the timeout to run gives consistency with async tests, but it neither improves developer experience nor helps catch bugs: the test will still take the same amount of time to run; jest will just mark it as failed.

This is the main reason why we'd like to be able to not have this check when running. Related to this, I think that many other users will face the same issues that we're facing at FB when upgrading jest (since these issues are not caused by our specific needs), so I believe that making this option opt-in will avoid some user friction until we figure out a better way to implement it.

Alternatives

Something we've discussed is not counting transpilation time when computing the timeouts: this would remove the flakiness caused by transpilation cache hits/misses, but could cause some confusion, since the timeout would no longer correspond to the total time it takes to run the test.

Any other alternative or way to improve this feature is appreciated 😄

Given that we can't stop sync tests when they time out and we already have the result when they finish, I don't think this feature is very useful for most people. It might be useful for defining "budgets" so you can prevent people from writing slow tests (as a team or project policy), but that could be implemented with the workaround @gitlab-winnie described.
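The "budget" idea above could be sketched as a plain wrapper. This is a hypothetical helper, not a Jest API, and since it's ordinary synchronous JavaScript it can only fail the test after the body completes:

```javascript
// Hypothetical helper sketching the "time budget" policy: run a synchronous
// test body, then fail if it took longer than allowed. Not a Jest API.
function withBudget(budgetMs, fn) {
  return () => {
    const start = Date.now();
    fn(); // the body always runs to completion; we can only fail afterwards
    const elapsed = Date.now() - start;
    if (elapsed > budgetMs) {
      throw new Error(`Test exceeded its ${budgetMs}ms budget (took ${elapsed}ms)`);
    }
  };
}

// Usage inside a suite:
// it('fast enough', withBudget(100, () => { /* test body */ }));
```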

EDIT: And maybe add "breaking change" to the changelog, even though it's technically a bug fix?

@SimenB I think this is a good idea. Even if it is a bug fix, it seems to be a bug that multiple projects relied on.

Could we also introduce a switch to disable the behavior (which may be deprecated in a future release)? I think that would give people more time to fix their long running tests.

@SimenB given that https://github.com/facebook/jest/pull/7259 has been merged, should we reopen this issue? 😄

yup

The problem with my workaround from https://github.com/facebook/jest/issues/6947#issuecomment-418786554 is that it doesn't work if you define an individual timeout for anything (afterAll, afterEach, beforeAll, beforeEach, describe.each, test, test.each, test.only)

@SimenB Would merging https://github.com/facebook/jest/pull/7074 again but behind a CLI / config option make sense to you? (something like timeoutSynchronous: true)

Yeah, good idea. And flip the default for a future major, maybe? Dunno.

@thymikee @rubennorte @rafeca thoughts on that?

I'm not opposed to having the option, but I'm still not sure if enabling it by default would be useful for most users. I'd find it confusing to have something like:

it('...', () => {
  const a = slowCall();
  expect(a).toBe(5);
  console.log(a);
});

Printing 5 and still failing.

Perhaps it's just a matter of using a different error message that makes that clear (instead of just a generic timeout error).
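For instance, a message template that makes both the limit and the actual duration explicit, as suggested earlier in the thread. The wording here is purely illustrative, not Jest's actual error text:

```javascript
// Illustrative only — not Jest's actual error message. Since a synchronous
// test always runs to completion, the real elapsed time is known and can
// be reported alongside the configured timeout.
function formatSyncTimeoutMessage(elapsedMs, timeoutMs) {
  return (
    `Test took ${elapsedMs}ms, exceeding the ${timeoutMs}ms timeout. ` +
    `Synchronous tests cannot be interrupted, so the test ran to completion ` +
    `and was marked as failed afterwards.`
  );
}
```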

but I'm still not sure if enabling it by default would be useful for most users

@rubennorte Maybe we can create a separate follow-up issue to discuss whether this should become the default behavior? Then people can throw their 👍s and 👎s at it after trying it out. 😄

Personally I wouldn't mind if it is not the default but having it at all would be a big help.

Hi, I was linked here from an SO question.

I'm writing a small instructional piece on JavaScript, and one part is on testing. A sub-section is on performance and Big O.

Is there a version out where I can enable timed synchronous functions?
