For every key press, input is sent to the server over a websocket and the response comes back over the websocket. Due to network latency between client and server, there is a lag before the output of each key press is printed. We cannot write the data on the client before the response arrives, because the server decides what should be written; for example, a password must not be echoed.
My question is: how can we overcome these network latency lags? Any ideas?
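To illustrate, the wiring looks roughly like this (a minimal sketch; `attach` is just an illustrative helper, assuming an xterm.js-style terminal with `onData` and `write` methods):

```javascript
// Minimal sketch of the round trip described above. `attach` is an
// illustrative helper, assuming an xterm.js-style term object with
// `onData` (fires on key press) and `write` (renders to the screen).
function attach(term, socket) {
  // Every keystroke travels to the server...
  term.onData(data => socket.send(data));
  // ...and nothing is rendered until the server's echo comes back,
  // so each character is delayed by one full network round trip.
  socket.onmessage = event => term.write(event.data);
}
```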
It is really hard to reduce the input lag because, as you noticed, the input has to make a full round trip over the network before it can finally be rendered in the terminal.
The smart people at Cloud9 have tried to work around this in the Cloud9 terminal by rendering input characters to the terminal as you type, and if the echo that comes back does not match the predicted characters, they patch the buffer:
https://github.com/c9/core/blob/master/plugins/c9.ide.terminal/predict_echo.js
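The idea can be sketched like this (a simplified illustration only, not the actual predict_echo.js code; `PredictiveEcho` and its methods are made-up names, and a real implementation also has to handle cursor movement, control sequences, and input that is never echoed, like passwords):

```javascript
// Simplified illustration of predictive local echo (hypothetical names,
// not the real predict_echo.js). Typed characters are echoed locally
// right away and remembered; when the real echo arrives we compare it
// against the prediction and patch the screen on a mismatch.
class PredictiveEcho {
  constructor(write) {
    this.write = write;   // function that renders text to the terminal
    this.pending = [];    // characters echoed before the server confirmed
  }
  onKey(ch) {
    this.pending.push(ch);
    this.write(ch);       // optimistic local echo, no round trip needed
  }
  onServerEcho(data) {
    for (const ch of data) {
      const predicted = this.pending.shift();
      if (predicted === undefined) {
        this.write(ch);   // unpredicted output (e.g. command results)
      } else if (predicted !== ch) {
        // Mis-prediction: erase the optimistic character with a
        // backspace sequence and write what the server actually sent.
        this.write('\b \b' + ch);
      }
      // predicted === ch: already on screen, nothing to do
    }
  }
}
```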
This could maybe be built as an addon, but I don't think any of us has the resources for it at the moment.
This is a really interesting scenario.
This could be some kind of "smart attach" addon, I guess, but as @mofux pointed out, there is not much availability to implement this right now, especially for the next couple of months while we are preparing for xterm.js 3.0.
Help and external contributions are always welcome and appreciated though 😄.
Not sure it would be wise to move on something like this until we have greater knowledge about what is going on within the terminal at any given time. Eventually I'd like to know whether command output is currently streaming, whether we're in the prompt, etc. which would help with something like this.
A view model is also needed for a number of things like improved accessibility and reflow and it could also help manage something like this.
Could xterm.js potentially also support WebRTC's RTCDataChannel as a transport? In theory it should outperform websockets. Mosh-like logic would need to be implemented to keep the server and client in sync, but this could become a truly powerful solution for slow or low-bandwidth scenarios.
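Since RTCDataChannel exposes the same `send`/`onmessage` surface as a WebSocket, the wiring could in theory stay transport-agnostic. A rough sketch (`attachToChannel` is a hypothetical helper, and the mosh-style sync logic is not shown):

```javascript
// Hedged sketch: because RTCDataChannel exposes the same `send`/`onmessage`
// surface as a WebSocket, the attach wiring can stay transport-agnostic.
// `attachToChannel` is a hypothetical helper, assuming xterm.js's `onData`
// returns a disposable subscription; the mosh-style logic that would keep
// client and server state in sync is not shown.
function attachToChannel(term, channel) {
  channel.onmessage = event => term.write(event.data); // server -> screen
  const sub = term.onData(data => channel.send(data)); // keys -> server
  return () => sub.dispose();                          // detach function
}

// Browser usage might look like this (untested sketch):
//   const pc = new RTCPeerConnection();
//   const dc = pc.createDataChannel('tty', { ordered: true });
//   dc.onopen = () => attachToChannel(term, dc);
// An ordered, reliable channel preserves terminal semantics; prediction
// and resync would have to layer on top of it.
```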
@exsilium xterm.js doesn't touch anything in that area; only the attach addon does, and that is just a simple example showing the two-way communication needed for a server/client setup.
I don't think we can really action this. If anything, it should live at the addon level, which will hopefully soon have a better story for community-driven projects: https://github.com/xtermjs/xterm.js/issues/1128#issuecomment-394142177
It seems someone created an addon for this purpose 🙂 https://github.com/wavesoft/local-echo
edit: I may have misjudged the purpose of the above addon. It seems it does not do the predictive local echo that mosh does.