Hi, I'm calling a contract's balanceOf method for 100K addresses. While I receive many good responses, I also get this error. I'm using "web3": "^1.0.0-beta.34". Any ideas?
Error: Invalid JSON RPC response: ""
at Object.InvalidResponse (D:\eosdac\node_modules\web3-core-helpers\src\errors.js:42:16)
at XMLHttpRequest.request.onreadystatechange (D:\eosdac\node_modules\web3-providers-http\src\index.js:73:32)
at XMLHttpRequestEventTarget.dispatchEvent (D:\eosdac\node_modules\xhr2\lib\xhr2.js:64:18)
at XMLHttpRequest._setReadyState (D:\eosdac\node_modules\xhr2\lib\xhr2.js:354:12)
at XMLHttpRequest._onHttpRequestError (D:\eosdac\node_modules\xhr2\lib\xhr2.js:544:12)
at ClientRequest.<anonymous> (D:\eosdac\node_modules\xhr2\lib\xhr2.js:414:24)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at TLSSocket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at TLSSocket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
Kasperfish, how are you calling balanceOf? Can you show a snippet? I'm running into similar issues.
Sure, I'm basically iterating over the different addresses with a map function; the temp array holds all the addresses. What are you doing, and are you using the same web3 version?
let balances = temp.map(x => self.getBalance(x).then(r => [x, r]));

getBalance(t) {
    return this.contract.methods.balanceOf(t).call().then(function (result) {
        let amount = result / Math.pow(10, 18);
        console.log(t + ' -> ' + colors.bold(amount));
        return amount;
    }).catch(function (e) {
        console.log(colors.bold.red(e));
    });
}
@kasperfish @PetroJames
Truffle is also seeing this with large test suites that make thousands of calls. It looks like the module web3 uses in the HttpProvider for async calls (XHR2) requests a port resource for each request, and most systems cap out when this count gets large: 2^14 - 1 (16383).
(There's a diagnosis by @barakman and extensive discussion about what might be done to address it in a thread starting here)
Thank you @cgewecke, this is really helpful information. I hope a core member of the web3 project can review this issue and come up with a solid fix. For now I will pause my script to free up some ports, or use a previous synchronous web3 version.
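Instead of pausing the whole script, another way to avoid exhausting ports is to bound how many requests are in flight at once. A minimal sketch of that idea in plain JS, where fn stands in for something like the getBalance above (mapLimited is a hypothetical helper, not part of web3):

```javascript
// Process items in chunks of `limit`, awaiting each chunk before starting
// the next, so at most `limit` HTTP connections (and ephemeral ports) are
// open at any one time.
async function mapLimited(items, limit, fn) {
    const results = [];
    for (let i = 0; i < items.length; i += limit) {
        const chunk = items.slice(i, i + limit);
        // Wait for this batch of promises before opening the next batch.
        results.push(...await Promise.all(chunk.map(fn)));
    }
    return results;
}

// Usage with a stand-in async function instead of a real balanceOf call:
mapLimited([1, 2, 3, 4, 5], 2, async x => x * 2)
    .then(r => console.log(r)); // logs [ 2, 4, 6, 8, 10 ]
```

With the real snippet you would call something like mapLimited(temp, 100, x => self.getBalance(x).then(r => [x, r])) in place of the unbounded temp.map(...).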
On Parity 1.9.5 it is working. Any Parity version newer than this errors out.
I found out what's wrong with my query. I am using web3 batch. Parity added a 5 MB limit with this commit: https://github.com/paritytech/jsonrpc/commit/06bec6ab682ec4ce16d7e7daf2612870c4272028
And Parity implemented a connection limit here: https://github.com/paritytech/jsonrpc/commit/9bbcfe6c720233fb35702ac5020d57cbc95c5208
This one is not hard-coded like the 5 MB limit. Please start Parity with --ws-max-connections=100000 and try again.
Same. I get the error every now and then for getBlock. It works 99% of the time, but since I'm constantly scanning epochs I run into the issue a lot. Any idea why this happens sometimes but not all the time?
@arshbot:
You can change the communication protocol of your Ethereum Node from HTTP to WebSocket.
Please note that this might introduce other problems: WebSockets are more efficient (you only need one connection), but HTTP connections are more reliable (they do not depend on a previous state), though with a massive workload you might end up with thousands of them kept alive simultaneously.
If your Ethereum Node happens to be Parity, then you can simply replace the jsonrpc switches with ws switches.
In your client code, replace the Web3 HttpProvider with a Web3 WebsocketProvider.
If you're using web3.js, then you'll need to upgrade to version v1.0 (if you're not already there).
If you're experiencing this also on Truffle (used for testing), then you can apply a local fix (at the end of npm install), as described here. Since Truffle currently relies on web3.js v0.18.4, this fix is different (not based on WebSockets).
P.S.:
On Parity, you can keep both jsonrpc API and ws API opened (on different ports).
And in your client code, you can maintain two instances of Web3, one using an HttpProvider and the other using a WebsocketProvider.
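A sketch of what that dual setup might look like when launching Parity (the port flag names are my assumption from Parity's CLI; check parity --help for your version — only --ws-max-connections is confirmed in this thread):

```shell
# Sketch only: serve the jsonrpc API over HTTP and the ws API over
# WebSockets on separate ports, raising the WS connection cap as
# suggested above. Flag names other than --ws-max-connections are
# assumed; verify against your Parity version.
parity --jsonrpc-port 8545 \
       --ws-port 8546 \
       --ws-max-connections 100000
```

Your client would then point the Web3 HttpProvider at port 8545 and the Web3 WebsocketProvider at port 8546.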
@kasperfish We have merged the PR https://github.com/ethereum/web3.js/pull/1753 and it will be released in 1.0.0-beta.36; this should solve it. Please drop a comment if this issue is still there in the next version.
@nivida I am observing what seems to be a similar issue with a later version, opened in #3370. It seems to have the same or very similar stack trace and does not require a large volume of transactions (though I also recall having previously encountered the issue with a large transaction volume).