Node: keypress events not working in windows cmd.exe & powershell

Created on 23 Feb 2016 · 90 comments · Source: nodejs/node

  • Version: 5.7.0
  • Platform: windows 10 x64
  • Subsystem: streams?

I make commits using cz-cli (it uses arrow keys for selection). I just updated Node from 5.6.0 to 5.7.0 on Windows 10, and now the arrow keys (update: also Ctrl+C and possibly other keys) don't work on the command line. Pressing an arrow key does nothing, as if the keypress event never happens.

I guess it's related to https://github.com/nodejs/node/issues/2996.

confirmed-bug tty windows

Most helpful comment

The issue is fixed with v6.2.0.

All 90 comments

@egoroof What terminal are you using?

CMD.
Update: PowerShell gives the same results.

possibly related report from nodeschool https://github.com/nodeschool/discussions/issues/1641

someone needs to bisect this one, @nodejs/platform-windows can anyone repro?

someone needs to bisect this one

Can I help somehow? Maybe I should run some tests to get more info about the bug?

@egoroof if you can set yourself up with a development environment to compile and test Node on your Windows machine, then you could help track down exactly where the bug was introduced. git bisect is a great tool to learn if you haven't used it before. You need to come up with a good test, manual or automatic, that tells you whether a version of Node is _broken_, find a spot in the git history where it's not broken, and then use git bisect to identify the actual commit where the breakage started; we can go from there.

Unless anyone in here has a better idea of where we can narrow this down to.

@egoroof if you could reproduce without external dependencies like commitizen that would be great. In the referenced issue, there are already examples. I can reproduce some. Trying to get gdb running on windows and hunt this bug down, too.

Here is the test:

// Pressing any key should make the process exit.
// Without the fix you have to press Enter for that to happen.
// The test can show false positives sometimes,
// so run it many times to be reasonably sure it really passes.
process.stdin.setRawMode(true);
process.stdin.on('data', function () {
    console.log('Exiting on any key press...');
    process.exit();
});
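
Save it to a file (for example keytest.js; the name doesn't matter), run it with node in cmd.exe or PowerShell, and press a single key: on a good build the process exits immediately, on a broken build nothing happens until you also press Enter.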

Here are the results (I tested with cmd.exe):

works:

  • 5.5.0
  • 5.4.1
  • 5.3.0
  • 5.2.0
  • 5.1.1
  • 5.1.0
  • 4.3.1

doesn't work:

  • 5.7.0
  • 5.6.0
  • 5.0.0

It seems this was originally fixed in 5.1.0 (https://github.com/nodejs/node/commit/af46112828cb28223050597f06b5e45a659e99d6). Then it came back in 5.6.0.

if it is to do with streams then it may be in one of the commits that have touched the streams code since v5.5.0:

7684b0f stream: fix no data on partial decode
f706cb0 streams: 5% throughput gain when sending small chunks
ee8d4bb stream: prevent object map change in TransformState
c8b6de2 stream: refactor redeclared variables

Only the last two in that list were in between v5.5.0 and v5.6.0. You could try backing them each out and see if it makes a difference.

Also, /cc @nodejs/streams

the second to last one is the only one that actually changes anything so I somewhat doubt the problem lies there

OK, I find this bug is complicated. The test is manual and it can sometimes show false positives, so I run it many times to be reasonably sure it really passes.
Moreover, I find the test has a better chance of passing when the machine is using a lot of resources. I started the PhpStorm IDE and, while it was starting, ran the test many times; in most cases it passed. Once the IDE had finished loading, the test failed in most cases.

Bisect result is: f1a082741743f51112bbf14d4d300f30e99d54d3 deps: sync with upstream c-ares/c-ares@2bae2d5.

To be sure that this is the cause:
git checkout nodejs/v5.x
git revert -n f0bd176d6d5523626efb3005cf41f11663bde280
git revert -n b46f3b84d4a35c9bf75a30dcfd68d8a3317713db
git revert -n f1a082741743f51112bbf14d4d300f30e99d54d3
vcbuild nosign

Then do the testing. It seems to work (I didn't see any failures with the test).
See https://github.com/nodejs/node/pull/5090 about these commits.

Can someone create an automated test for this? I have no idea how.

Thanks for investigating @egoroof, what we first need to do is figure out how on earth c-ares could be implicated on processing stdin on Windows, and if that is indeed the cause of the bug, how do we fix it?

/cc @indutny @saghul (since you merged and reviewed that patch ... just in case you might have a clue here)

Wow, this is a bit unexpected. I will take a look.

That bisect result is weird. c-ares is not related in any way to how key presses are handled, which is done in libuv's tty handle.

Unless it's some bizarre define?!

From https://github.com/nodeschool/discussions/issues/1641#issuecomment-189219848:

I think I've gotten to the bottom of this. It seems that the problem with the non-responsive menus only occurs when I have an instance of the Microsoft Edge browser running. When I kill all instances of Edge, the tutorial menus do work.

There's a network connection ... doesn't help much but perhaps adds a little bit of credence to the c-ares implication.

The last issue didn't manifest on Windows 7. I think only 10 and possibly 8 were affected. Can someone verify?

Tried to reproduce on Windows 10, Node.js 5.7.0, to no avail. @egoroof is it possible that your bisect is in error? I find it very hard to believe that an update to c-ares is really the cause.

@silverwind maybe. I'll try to do more tests and another bisect. But the fact that reverting those commits makes it work seems to be true.

@silverwind just finished new bisect. Got the same result:

E:\Dropbox\repo\node>git bisect start
E:\Dropbox\repo\node>git bisect bad
E:\Dropbox\repo\node>git bisect good v5.5.0
Bisecting: 152 revisions left to test after this (roughly 7 steps)
[a3a0cf603a2581e3641ddd44015d07d6d367de2c] tools: add arrow function rules to eslint

E:\Dropbox\repo\node>git bisect bad
Bisecting: 75 revisions left to test after this (roughly 6 steps)
[a2881e2187d8b924d09dedf5f091fbacf7323aa7] test: remove test-cluster-* var redeclarations

E:\Dropbox\repo\node>git bisect good
Bisecting: 37 revisions left to test after this (roughly 5 steps)
[dde160378e9f7de3b71014448da3613967502610] doc: fix link in cluster documentation

E:\Dropbox\repo\node>git bisect good
Bisecting: 18 revisions left to test after this (roughly 4 steps)
[ec62789152d4cebdeed64e49cee73a0c131dcd88] crypto: fix memory leak in LoadPKCS12

E:\Dropbox\repo\node>git bisect good
Bisecting: 9 revisions left to test after this (roughly 3 steps)
[8e579ba759b9964adf6b6029d1c37d4468e2f5d3] http: strictly forbid invalid characters from headers

E:\Dropbox\repo\node>git bisect bad
Bisecting: 4 revisions left to test after this (roughly 2 steps)
[c4c8b3bf2ea37c972b07ad820987549f1861ba34] doc: fix dgram doc indentation

E:\Dropbox\repo\node>git bisect bad
Bisecting: 1 revision left to test after this (roughly 1 step)
[b46f3b84d4a35c9bf75a30dcfd68d8a3317713db] src,deps: replace LoadLibrary by LoadLibraryW

E:\Dropbox\repo\node>git bisect bad
Bisecting: 0 revisions left to test after this (roughly 0 steps)
[f1a082741743f51112bbf14d4d300f30e99d54d3] deps: sync with upstream bagder/c-ares@2bae2d5

E:\Dropbox\repo\node>git bisect bad
f1a082741743f51112bbf14d4d300f30e99d54d3 is the first bad commit
commit f1a082741743f51112bbf14d4d300f30e99d54d3
Author: Fedor Indutny <[email protected]>
Date:   Thu Feb 4 16:34:47 2016 -0500

    deps: sync with upstream bagder/c-ares@2bae2d5

    PR-URL: https://github.com/nodejs/node/pull/5090
    Reviewed-By: Saúl Ibarra Corretgé <[email protected]>

:040000 040000 9cde32ee881592e87cc8f9631ec1ec4aa316ed0a 158a3bae5fb407f6d7212d249542600b5b3ad802 M  deps
:040000 040000 ec2de4335873088bb0c5776400bb40eba289ad9d 3f6e5e0926a51b388849d034fa3faa3b3803ee36 M  src

Hmm, maybe we can provide some kind of test build with that commit reverted so people who can reproduce can test it?

I've done the bisect like @egoroof (Windows 10)

Here are my results:

git bisect start
# good: [dd882563e56652f76e25994d046ab49b1bc5f836] 2016-01-20, Version 5.5.0 (Stable)
git bisect good dd882563e56652f76e25994d046ab49b1bc5f836
# bad: [645d4d5d593c3822e567efec47ac375cfd86b83f] 2016-02-09, Version 5.6.0 (Stable)
git bisect bad 645d4d5d593c3822e567efec47ac375cfd86b83f
# good: [612ce66c7816193d43c435336216364a12c3dd53] net: refactor redeclared variables
git bisect good 612ce66c7816193d43c435336216364a12c3dd53
# good: [87b27c913d5c292b36ebde26928b157b82649341] test: fix redeclared test-intl var
git bisect good 87b27c913d5c292b36ebde26928b157b82649341
# good: [95615196de77aedebc921dd52c82f63fd8f9e099] src: clean up usage of __proto__
git bisect good 95615196de77aedebc921dd52c82f63fd8f9e099
# bad: [b46f3b84d4a35c9bf75a30dcfd68d8a3317713db] src,deps: replace LoadLibrary by LoadLibraryW
git bisect bad b46f3b84d4a35c9bf75a30dcfd68d8a3317713db
# good: [9f7aa6f8686ccdcbb3fd012f700a5e551445d30d] doc: clarify dgram socket.send() multi-buffer support
git bisect good 9f7aa6f8686ccdcbb3fd012f700a5e551445d30d
# good: [d9e934c71f1c2b87bb837ac808204391c794c95b] crypto: add `pfx` certs as CA certs too
git bisect good d9e934c71f1c2b87bb837ac808204391c794c95b
# bad: [f1a082741743f51112bbf14d4d300f30e99d54d3] deps: sync with upstream bagder/c-ares@2bae2d5
git bisect bad f1a082741743f51112bbf14d4d300f30e99d54d3
# good: [ec62789152d4cebdeed64e49cee73a0c131dcd88] crypto: fix memory leak in LoadPKCS12
git bisect good ec62789152d4cebdeed64e49cee73a0c131dcd88
# first bad commit: [f1a082741743f51112bbf14d4d300f30e99d54d3] deps: sync with upstream bagder/c-ares@2bae2d5

I reverted a small hunk in c-ares and the problem disappears.

Here are the changes I made on top of f1a082741743f51112bbf14d4d300f30e99d54d3:

diff --git a/deps/cares/src/ares_init.c b/deps/cares/src/ares_init.c
index 4607944..4c7eaf6 100644
--- a/deps/cares/src/ares_init.c
+++ b/deps/cares/src/ares_init.c
@@ -1068,6 +1068,10 @@ done:
  */
 static int get_DNS_Windows(char **outptr)
 {
+  /* Try using IP helper API GetAdaptersAddresses() */
+  if (get_DNS_AdaptersAddresses(outptr))
+    return 1;
+
   /*
      Use GetNetworkParams First in case of
      multiple adapter is enabled on this machine.
@@ -1078,10 +1082,6 @@ static int get_DNS_Windows(char **outptr)
   if (get_DNS_NetworkParams(outptr))
     return 1;

-  /* Try using IP helper API GetAdaptersAddresses() */
-  if (get_DNS_AdaptersAddresses(outptr))
-    return 1;
-
   /* Fall-back to registry information */
   return get_DNS_Registry(outptr);
 }

But I still haven't understood the real bug. It seems that get_DNS_NetworkParams() is doing something that breaks the first keypress event.

It needs more investigation. Just reverting this part is a hack, not a fix.

I also fail to see how these are related. @Skywalker13 what test did you use to determine if a revision was good or bad?

I use my project https://github.com/Xcraft-Inc/shellcraft.js with its example. It's just an inquirer.js prompt (the related bug was https://github.com/nodejs/node/issues/2996, reported by inquirer's author).

So my "test" is manual.
Sorry.

Ideally this should be tested with something which doesn't do any networking or DNS queries, to take c-ares out of the picture.

This test could be a good start. This doesn't mean there isn't another DNS problem related to the c-ares update though.

@saghul I confirm the behaviour with the small https://github.com/nodejs/node/issues/5384#issuecomment-188184941 script. The revert of https://github.com/c-ares/c-ares/commit/52ecef76df077b393d4108d5275e9b4f4622ddce fixes (hides?) the bug.

It's very difficult for me to test with #5384 (comment) because the bug appears maybe only 1 time in 20 with this script...
It smells like a race condition.

I don't understand. You said the above, which to me it means the bug is not related to c-ares, since it also happened with prior versions of Node. If the test is unreliable, but equally unreliable on older Node verions I'd say we're looking in the wrong place.

I withdraw that previous message because it was just a stupid mistake (not related to Node, but to me).
With the small script it fails every time. The 1-in-20 was because I accidentally pressed Return instead of another key. So please forget it.

Ok.

I've built node in Debug and tried to debug with VS2015.

When node is built in debug, it fails every time, even if https://github.com/c-ares/c-ares/commit/52ecef76df077b393d4108d5275e9b4f4622ddce is reverted.

So it gets more interesting. I can try a bisect testing Debug builds instead of Release; maybe it will stop on another commit.

Just for information:

I've tried Debug builds (on Windows 10 with vc140) of io.js v2 and v3 and Node v4 and v5; the problem exists everywhere.

Now I'm trying to build Node 0.12.9 because I'm still unable to find a single release where it works.

I just built Debug builds of v5.1.0 and v5.5.0 with VS2013; both fail every time.

Also, with a Debug build I see a big lag between pressing a key and the first character appearing in the console.

I've tried 0.12.9 and 0.10.42 with debug and the problem is always the same. It just doesn't work at all.

@egoroof I have this lag too; I think it's just because the debug symbols and compiler settings add significant overhead. It's not really a surprise for a debug build.

IMHO the problem is ultimately something like a race condition, because when it works, it's faster (the Release build is much faster at init). The c-ares revert, for example, is just a side effect because it changes the speed of Node's init.

An idea: we should try building in Release and adding a sleep in the init (for example where c-ares is initialized). Maybe just that change will reproduce the bug in a Release build too.

@Skywalker13 I agree with "race condition". As I said before:

Moreover, I find the test has a better chance of passing when the machine is using a lot of resources. I started the PhpStorm IDE and, while it was starting, ran the test many times; in most cases it passed. Once the IDE had finished loading, the test failed in most cases.

A race condition in stdin handling was my suspicion too last time, by the way.

I have more info now. It seems that the race is in node/deps/uv/src/win/tty.c.

I put a breakpoint in static DWORD CALLBACK uv_tty_line_read_thread(void* data) in order to understand.

The thread calls

if (ReadConsoleW(handle->tty.rd.read_line_handle,
                   (void*) utf16,
                   chars,
                   &read_chars,
                   NULL)) {

and it blocks there. Another thread then calls int uv_tty_set_mode(uv_tty_t* tty, uv_tty_mode_t mode)

And changes the console mode:

  if (!SetConsoleMode(tty->handle, flags)) {

With the right flag

case UV_TTY_MODE_RAW:
      flags = ENABLE_WINDOW_INPUT;
      break;

That is the process.stdin.setRawMode(true) call.

But it's too late: the other thread is already blocked on ReadConsoleW.

@Skywalker13 interesting findings! The tty code tries to cancel the in-progress read when switching modes, so that should have worked... maybe @orangemocha can help?

In my tests, the UV_HANDLE_READING flag is not set while the thread is already blocked on ReadConsoleW, so this check in uv_tty_set_mode does not trigger:

  /* If currently reading, stop, and restart reading. */
  if (tty->flags & UV_HANDLE_READING) {

Happy to help, of course. Let me try to reproduce your findings....

With the test given at https://github.com/nodejs/node/issues/5384#issuecomment-188184941, I can reproduce that keypress events early in the process lifetime sometimes get processed in normal non-raw mode. I am not sure if that captures the entire issue, but at least for that part I think I have an explanation.

When process.stdin first gets accessed, a read is started and then stopped soon after in the same initialization function. The read ends up queueing a work item to the thread pool to read the console with ReadConsoleW. If a keypress was already in the system buffer, it will get consumed at this point.

I am thinking that tty.ReadStream should be created with pauseOnCreate: true. I am testing a fix.
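
A rough sketch of that idea from the JS side (assumptions: that tty.ReadStream forwards the pauseOnCreate option to the underlying net.Socket, and that the consumer only resumes the stream after switching to raw mode; the actual fix belongs in Node's bootstrap code, not in userland):

// Sketch only: illustrates the pauseOnCreate idea, not the actual patch.
// Assumes tty.ReadStream passes the option through to net.Socket.
var tty = require('tty');

// Create the stream paused, so no cooked-mode ReadConsoleW work item is
// queued that could swallow the first keypress.
var stdin = new tty.ReadStream(0, { pauseOnCreate: true });

stdin.setRawMode(true); // switch the console to raw mode first...
stdin.resume();         // ...and only then start reading
stdin.on('data', function () {
  console.log('Exiting on any key press...');
  process.exit();
});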

Same issue here. I'm just checking in on the fix @orangemocha is testing.

Still working on it. I got sidetracked by other issues, sorry. The fix I came up with is https://github.com/orangemocha/io.js/commit/979fd95c38cb07ffaf7e18be84e39f14da695d60. It solves the keypress issue as I understood it, but it now causes new failures in the stdin tests (e.g. test/parallel/test-stdin-resume-pause.js). I hope to make more progress this week.

Adding land-on-v5.x as it looks like this still needs to land in v5.x.

The fix had an unwanted side effect (https://github.com/nodejs/node/issues/5927), so I removed the land-on-v5.x label.

Please reopen this issue, because people think it's already fixed (https://github.com/nodejs/node/issues/5940).

I can replicate this issue on windows 10 x64, node v5.8.0

I can replicate this issue on nodejs v5.10.0 with Windows 10 x64 cmd using yo cli. Sometimes the arrow keys do work. Why is it so unpredictable?

Able to reproduce on Windows 10 Home 64-bit

The issue occurs when using CMD, Babun, and PowerShell
node v5.10.1
npm v3.8.3
bower v1.7.9
yo v1.7.0
gulp CLI v1.2.1

Unfortunately I've been able to reproduce this issue also after updating to:

node v5.10.1
npm 3.8.6
yo 1.7.0

Arrow keys do not work, at least under Windows 10 Pro 64 bit.

The surprising fact is that the behavior is quite random. 5 out of 10 times the arrow keys do not work; the cursor gets stuck on the readline and you can type random text. Additionally, during menu navigation the behavior can change from working to not working and vice versa. In short: it looks like every time a readline is executed the dice are rolled.

node --version
5.10.1

npm --version
3.8.6

yo --version
1.7.0

ver
Microsoft Windows Version 10.0.10586

Same problem here (W10 x64) with:
node --version v5.10.1
npm --version 3.8.3
yo --version 1.7.0
bower --version 1.7.9

I tried reinstalling yo and keypress worked, but after relaunching it doesn't work anymore...
Is any fix planned?

You should read the comments on this issue; then you will see that the problem is mostly identified and @orangemocha has already made a fix (reverted because it broke something else). So it's not trivial.

Same issue here https://github.com/nodejs/node/issues/5940
Inquirer.js
Yeoman
npm-check
None of them work because of this issue.

Same issue with Node.js 5.11.0 under Windows 10 (64-bit), arrow keys do not work.

Maybe we're looking at it wrong. Maybe it has something to do with Windows and not node.

Maybe you are right @dsouzadyn. I've tested inquirer.js with cmder and it works normally.

(screenshots omitted)

@dsouzadyn @sant123 Last time I checked node 4 (before upgrading to 6, about 2 days ago) everything was working fine, so this is a plain old regression.

@fatfisz what I read was that this bug was fixed and then appeared again; so if it was working in those versions with cmd and PowerShell on Windows, it should be working again. Meanwhile I'm going to continue using cmder.

Been working on this and I think we have the issue fully understood. We are working on a fix in libuv.

@orangemocha could you link to the libuv issue if there is one?

Note: This issue is going to bother a wide range of NodeSchool attendees. It would be nice to know how to detect versions of Node.js that are not suited to work interactively (to show a useful error message)

Would it be possible to produce builds of some sort (unofficial?) for the time being that revert the c-ares upgrade? (Is that even a good idea?)

The c-ares update just changed the timing of node startup but the race condition has always been there. We are working on a fix in libuv. I am hoping we can open a pull request there early next week. As a workaround, arrow keys should work on Windows 7.

Also, if there is a v5.x build with my previous fix that got reverted, that would work as a workaround.

Hey, how about we just do this on 5.x? I was chatting with @jasnell about using old stable lines for a bit of experimentation, and while we want to keep them relatively stable, I think they are an opportunity to mess with things a _little_, and this is a perfect case. @orangemocha would you mind putting up a PR against 5.x that reverts c-ares (or whatever we need to revert) to get this working? Then we could get folks to use the latest 5.x rather than 6.x if they need to have this working.

+1 definitely support this approach

We should have the proper libuv fix ready today. We can float it in 5.x, and perhaps we should do the same in master and v6.x.

The c-ares update just changed the timing of things but the race condition was always there, so reverting it might fix the issue for some but probably not for all.

Sounds like it should land in master regardless? Maybe we'll only release it in v5.x for a couple of weeks first, though?

@orangemocha it is worth mentioning that the libuv project has not been super keen on seeing floating patches on node, in fact there has never been one afaik

https://github.com/nodejs/node/pull/6392#issuecomment-214662001 for reference

I think it should land on master regardless. By the way, it looks like we'll need one more day....

@orangemocha it is worth mentioning that the libuv project has not been super keen on seeing floating patches on node, in fact there has never been one afaik

@saghul this issue has quite a severe impact on Node users on Windows. Do you see any problems with us backporting the fix (once reviewed in libuv)?

We have a few fixes already on v1.x, I can do a release and the corresponding PR to Node as soon as the fix lands. Would that work for everyone? I'd really want to avoid floating patches.

@saghul would you be willing to do an LTS patch release to 1.8.x with a few cherry picks? We never updated v5 to 1.9.x

edit: I'd be willing to help with building / testing this

Would that work for everyone? I'd really want to avoid floating patches.

Fine by me, if you are prepared to treat this with high priority. Thank you!

On 10/05/16 22:03, Myles Borins wrote:

@saghul would you be willing to do an LTS patch release to 1.8.x with a few cherry picks? We never updated v5 to 1.9.x

Not really. It would have to be 1.9.x, which is ABI and API compatible with 1.x, which I guess should make it LTS-friendly.

If that is the case I'd almost opt for not backporting to v5 at all... /cc @nodejs/lts

edit: some reasoning. While we know there are experience-breaking bugs without this update, there are quite a few edge cases we are still chasing down with the change to v1.9.0.

This includes changes to the behavior of process.exit() and potentially some changes to how require works

I think that we will open more support issues by updating.

We should definitely update v6 asap though, this will be a huge improvement for windows users.

sounds reasonable, as long as we get 6.x fixed I'm easy

A fix is pending in libuv: https://github.com/libuv/libuv/pull/866

The issue is fixed with v6.2.0.

Good job!
Thanks, everyone, for the help.

Just tested it. Works like a charm. Great job!

Does anyone have a little library that tells me when the running Node.js is known to fail in interactive mode?

@martinheidegger the race condition was always latent (until the fix in 6.2.0) but was apparently exacerbated by the c-ares update (https://github.com/nodejs/node/commit/f1a082741743f51112bbf14d4d300f30e99d54d3).

I would consider versions >= 5.6.0 and < 6.2.0 to be prone to this issue.

@orangemocha I also remember that versions < 4.2 and < 5.3 were broken by a previous issue.

Yes, I believe the race condition has always been there. Various commits affected the timing of Node startup and so the likelihood of this issue reproducing has varied with different versions. It should also be noted that the problem only affects Windows 8 and higher.

We should consider backporting a fix to v4. I'll also try to think of a userland workaround, though I am not sure one is possible.

@orangemocha (and others) see https://github.com/nodejs/node/issues/6806 Hopefully we will have the patched libuv in Node v4 once we sort out the stdout/err shenanigans.

@martinheidegger the following SemVer ranges should hopefully cover it:

  • known-broken: >=4.0 <4.2 || >=5.0 <5.3 || >=5.6 <6.2
  • known-fixed: <4 || >=4.2 <5 || >=5.3 <6 || >=6.2

I did consider a runtime feature-detection approach, but some of the earlier reports made it seem non-deterministic.

I'll put together a package to just do a SemVer comparison against the node version unless someone else beats me to it. :)
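
For reference, a minimal sketch of such a check, assuming the semver package from npm and the ranges listed above (the actual package mentioned in the next comment may differ):

// Minimal sketch, assuming the 'semver' package (npm install semver) and the
// known-broken ranges quoted in this thread; this is not the actual package.
var semver = require('semver');

var BROKEN = '>=4.0.0 <4.2.0 || >=5.0.0 <5.3.0 || >=5.6.0 <6.2.0';

function keypressLikelyBroken() {
  // Per this thread, the bug only shows up on Windows (8 and higher).
  return process.platform === 'win32' &&
         semver.satisfies(process.version, BROKEN);
}

if (keypressLikelyBroken()) {
  console.error('Node ' + process.version + ' may drop the first keypress in ' +
                'interactive prompts on Windows; consider upgrading to >= 6.2.0.');
}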

@martinheidegger here we go:

Feel free to let me know if it has any false-positives / false-negatives for you. :)

Keypress doesn't work in Git Bash either. But it works fine with ConEmu64 (https://conemu.github.io/) for me on Windows 10.
