Tested under Ubuntu 16.04.
When running:
time ls -lR /usr
Hyperterm's rendering drops to about one frame every few seconds, and it crashes (turns all black, then all white) shortly afterwards.

By comparison, gnome-terminal runs in:
real 0m6.967s
user 0m1.372s
sys 0m2.444s
This is almost certainly an issue with hterm and a use case they haven't optimized around. As an aside, we've worked a bit to get this usable on xterm.js, which runs it in around 10 seconds, but there are still improvements to be made.
@Tyriar, is this still the case?
@CodeTheory yes 😞
I don't know if I'm having the same issue.
With commands that generate a lot of output, Hyper will stop responding.
Memory usage keeps going up, and then it just stops responding.
This command will do that:
time ls -lR /usr
This is affecting me too.
Had to switch to an alternate terminal today when this popped up. Grepping consistently crashes the program (Hyper Helper = 100% CPU usage) and makes me lose all tabs. This is a deal breaker.
I can second this. Deal breaker for me, too. Sadly, I had to switch back to Terminal.app today, much as I love Hyper.js.
Somewhat related, I assume: Hyper hangs when doing an npm install of moderate size (nothing crazy, 20-30 direct dependencies) on macOS (MBP 2017, SSD, 16GB RAM).
Eventually it gets back to normal, but until then it's completely unresponsive.
Definitely a deal breaker. I want to use Hyper, but it's impossible to do so when it can't handle large outputs/logs in a day-to-day dev environment. Hoping some more progress is made on the performance side of things.
This is happening to me too; Hyper freezes all the time after I run npm install. What could it be?
@bntzio This issue is supposed to be fixed in our canary release (which uses xterm.js instead of hterm).
Can you confirm you're using the stable release?
macOS 10.13.1
Hyper 1.4.8
Plugins:
hyperterm-atom-dark (1.1.2),
hyperlinks (0.5.0),
hypercwd (1.1.1),
hyperterm-alternatescroll (1.3.0),
hyper-tabs-enhanced (0.4.2)
I have a directory with 266310 text files (csv data dumps) on a remote server. When connected over ssh, executing ls freezes Hyper completely. I can't access any tab or do anything other than hitting ⌘+Q, losing all tabs. I can't even open DevTools.
Native Terminal.app works just fine.
Using the canary release 2.1.1, it also locks up but eventually finishes displaying all output. However, there are errors and performance problems; see below.
This exception is thrown right before displaying all output:

There are 20583 errors listed in the console:
/Users/jakub/.hyper_plugins/node_modules/hypercwd/index.js:32 Error: spawn /bin/sh EAGAIN
at exports._errnoException (util.js:1050)
at Process.ChildProcess._handle.onexit (internal/child_process.js:193)
at onErrorNT (internal/child_process.js:367)
at _combinedTickCallback (internal/process/next_tick.js:80)
at process._tickCallback (internal/process/next_tick.js:104)
DevTools performance profiling

@chabou using Hyper 1.4.8 (stable)
Has any progress been made on this? It's still an issue with 1.4.8 stable. I am using zsh instead of bash, if that makes any difference.
@AlienHoboken this issue should be fixed in the Canary build (v2.x).
I've just downloaded 2.0.0-canary.8 and am still having the issue.
Running time ls -lR /usr, the output initially freezes. After ~30s the output jumps a couple of times, then after a bit more time it starts flowing more fluidly. It is still notably choppier than running the command in Terminal.app.
macOS 10.12.6
Hyper 2.0.0-canary.8 (stable)
Plugins:
hyper-blink (1.1.2),
hyper-statusline (1.7.4),
hyper-tabs-enhanced (0.4.2),
hyperborder (0.11.1),
hyper-search (0.0.7)
If I can provide any more information, I'd love to.
It is still possible to stall the terminal (it depends largely on your hardware). The cause is that the parser runs on the main thread and is limited to parsing a maximum of 300 characters per frame: https://github.com/xtermjs/xterm.js/blob/8c4e6b0e5b8188c361f7e787cca18fadb26d8d33/src/Terminal.ts#L60-L64
The fix for this would be to do that work in a web worker, but that's probably a ways off (unless there is a willing contributor).
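To make the constraint concrete, here is a minimal, purely illustrative sketch of the per-frame batching pattern described above. The class and constant names are made up for this example (they are not taken from the xterm.js source); only the 300-characters-per-frame budget comes from the linked code:

```js
// Illustrative sketch only (not the actual xterm.js source): incoming data is
// queued, and at most a fixed number of characters is handed to the parser per
// animation frame on the main thread. Large bursts of output therefore take
// many frames to drain, which looks like a freeze followed by "jumps".
const WRITE_BATCH_SIZE = 300; // assumed per-frame character budget

class BatchingTerminal {
  constructor() {
    this.writeBuffer = [];
    this.flushScheduled = false;
  }

  // Called for every chunk of output coming from the pty.
  write(data) {
    this.writeBuffer.push(data);
    if (!this.flushScheduled) {
      this.flushScheduled = true;
      requestAnimationFrame(() => this.flushFrame());
    }
  }

  // Runs once per frame: parse until the budget is exhausted, then yield.
  flushFrame() {
    let parsed = 0;
    while (this.writeBuffer.length > 0 && parsed < WRITE_BATCH_SIZE) {
      const chunk = this.writeBuffer.shift();
      this.parse(chunk); // parsing + buffer updates happen on the main thread
      parsed += chunk.length;
    }
    if (this.writeBuffer.length > 0) {
      // Leftover output waits for the next frame.
      requestAnimationFrame(() => this.flushFrame());
    } else {
      this.flushScheduled = false;
    }
  }

  parse(chunk) {
    /* escape-sequence parsing and screen-buffer updates would go here */
  }
}
```

In this setup, moving parse into a web worker would free the main thread for rendering, at the cost of sending the resulting screen-buffer updates back via postMessage, which is roughly the change hinted at above.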
If it helps, the specs of the Mac I'm working on are as follows:
i7-4870HQ CPU @ 2.50GHz
16GB RAM
(is there anything else that would make a difference?)
Interestingly, the time ls -lR /usr command used ~200% of my CPU at its peak, and a negligible amount of RAM.
Thanks for the speedy reply! I've never looked into the xterm.js internals, or really at web workers, but I will have a look. I probably won't be capable of helping out, though :(
I found this today when I tried using ack to search through a folder. Hyper just froze and I had to force-quit.
The same is happening to me. Sometimes I have to run cURL commands in the terminal, and the large output instantly crashes Hyper.js.
@iamvinny Can you try the canary release? It certainly stress tests your system, but it doesn't lock up Hyper for me anymore.
@timothyis, thanks! Changing the update channel to canary solved the problem.
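For anyone else wanting to try this, switching channels is a small edit to ~/.hyper.js. The snippet below is a minimal excerpt based on the default config; the rest of the config is omitted here, so double-check your own file:

```js
// ~/.hyper.js (excerpt)
module.exports = {
  config: {
    // 'stable' is the default; 'canary' opts into the xterm.js-based 2.x prereleases
    updateChannel: 'canary',
    // ...the rest of your existing config stays as-is
  },
  plugins: [],
};
```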
V2 is out 🎉