For asynchronous assertions we capture a stack trace before the assertion is run. Inspired by this comment by @alxandr, we should explore storing an error object instead, and reading the stack trace later only if it's actually necessary:
… instead of passing the stack trace itself, an error object could be passed. This has less performance impact, because then the stack trace is only generated on demand (this might even be something that could improve the performance of your existing async tests).
See lib/test.js and lib/assert.js.
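The idea can be sketched as follows. This is not AVA's actual code, just a minimal illustration of the technique: in V8, constructing an `Error` records the structured call sites cheaply, while the expensive formatting of the `.stack` string only happens on first access. So by storing the `Error` object and only touching `.stack` when an assertion actually fails, the formatting cost is skipped entirely on the happy path. The names `captureCallSite` and `runAssertion` are hypothetical:

```javascript
// Sketch only — illustrative names, not AVA's internals.

function captureCallSite() {
  // Cheap in V8: records structured frames, but does not yet
  // format the human-readable stack string.
  return new Error('assertion call site');
}

function runAssertion(assertionFn) {
  // Capture the call site eagerly, before any async work happens,
  // so the trace points at the test rather than the event loop.
  const callSite = captureCallSite();
  return Promise.resolve()
    .then(assertionFn)
    .catch(error => {
      // Only on failure do we read `.stack`, triggering the
      // (comparatively expensive) string formatting on demand.
      error.savedStack = callSite.stack;
      throw error;
    });
}
```

Passing tests never pay for stack formatting, which is where the performance win for existing async tests would come from.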
This may not work on older Node.js versions, by the way.
Hi, I am new to contributing here.
I would like to work on this bug.
I have an idea of the issue that is currently present, but I might need some help testing the code after I modify it. @novemberborn, can you please provide some info?
@monicabhalshankar great, it's all yours!
Could you be more specific in what info you need? Feel free to open a PR with what you've got so far.
@novemberborn so far I have gone through the two files linked above.
My understanding is that we can modify the test.js file to generate the stack trace only when the plan count and assertion count are equal.
Apart from this, I did not find any other modification that could be made.
Is this what is expected? Or should I be considering some other scenarios as well?
@monicabhalshankar this issue links to existing stack-getting methods. The proposal is to capture the stack in an object and return that object, rather than returning the stack string. Then elsewhere we access the stack from the returned object, only when needed.
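A hypothetical before/after sketch of that refactor, under the assumption described above (illustrative names only, not AVA's actual internals):

```javascript
// Before: the stack string is formatted eagerly, even if never used.
function getStackEager() {
  return new Error().stack; // pays the formatting cost up front
}

// After: return the Error object itself; formatting is deferred.
function getStackObject() {
  return new Error(); // cheap capture of structured frames
}

// The caller reads `.stack` only when it actually needs it,
// e.g. when rendering a failed assertion.
function formatFailure(capturedError, message) {
  return `${message}\n${capturedError.stack}`;
}
```

Callers that previously received a string now hold an `Error` and defer the `.stack` access to the failure path.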
I believe I understand the request. I'm also a beginner, but I'd be willing to assist. @monicabhalshankar let me know if there's anything I can do to help.
@novemberborn I'm a beginner to open source. Is this issue still open to work on? I would like to make a contribution.
Are you still working on this @monicabhalshankar? Given that you've been quiet here for a while, I hope you don't mind if @Sharan-ram has a go at this as well?
@Sharan-ram would be great if you could give this a go. There's a chance @monicabhalshankar is nearly done with this of course.
Hey, is this already done? May I give it a go?
Sorry, I have been busy for a while, so I could not work on the issue.
Please feel free to give it a go
@novemberborn I'm a beginner, and I'd like to work on this issue.
Of course @sainalshah, it's all yours.
@sainalshah Let me know if you'd like to collaborate! I'm also a beginner to open source and would like to contribute as well.
@Jolo510 yeah sure, but how do you want to collaborate?
@sainalshah Sweet! Let's take a look at it individually first, then come together and discuss it? We should take this chat off this thread.
@Jolo510 we can have this chat on hangouts
@sainalshah Sure! Send me an email at [email protected]
Hi, I am new to open source, but I would like to contribute here if possible?
Go for it, @peterr101!
Safe to say no one is working on this?
@Ullauri yea, looks available. Go for it!
Sounds good. I think I understand what this request is asking for. Either way, I'll open up a PR for it when I'm done.
@issuehunt has funded $40.00 to this issue.
I'd like to take a stab at this. I will open a PR with some preliminary/not unit tested code to see if I understood the requirement right.
@novemberborn Maybe the bot is to blame, but shouldn't @mihai-dinu have been awarded the bounty for this? The issue is still up on IssueHunt.
@ulken good find. I'm not sure… @sindresorhus?
@sindresorhus has rewarded $36.00 to @mihai-dinu. See it on IssueHunt