On an Ubuntu Linux environment (Trusty), tests randomly fail with this ExtendableError:
1) Contract: TopicEvent "after each" hook for "throws on an invalid result index":
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:9401:17)
at new ProviderError (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:325054:24)
at /home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:325137:17
at /home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (/home/travis/.nvm/versions/node/v8.9.3/lib/node_modules/truffle/build/cli.bundled.js:176765:24)
at Socket.socketOnEnd (_http_client.js:423:9)
at endReadableNT (_stream_readable.js:1056:12)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
Specifically, I have a Travis-CI (continuous integration) setup, and that is where the tests are failing. My local Mac OS X environment passes these tests with no problem. Every once in a while they will fail with the same error, but I just run the tests again and they pass.
I'd say it happens roughly 10-15% of the time on Mac OS X, but roughly 60-80% of the time in the Travis-CI Linux environment.
This error also seemed to occur less often on earlier Truffle versions; I just updated to 4.0.4 and it now happens far more frequently.
Tests should pass as they do in the Mac OS X environment.
I test this on my local machine (Mac OS X), where all tests pass, and then push to GitHub. That fires off a Travis-CI build on the Linux environment, which fails pretty much every time.
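For reference, the truffle.js configuration the error message refers to is just a development-network block. A minimal sketch (the host, port and network_id shown are the usual ganache-cli defaults, not necessarily the values used in the failing CI environment):

// truffle.js - minimal sketch; the values below are assumed defaults
module.exports = {
    networks: {
        development: {
            host: "localhost", // where the test client (ganache-cli/testrpc) listens
            port: 8545,        // default JSON-RPC port
            network_id: "*"    // match any network id
        }
    }
};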
Travis-CI Env (fails)
Mac OSX Env (passes)
$ gcc --version:
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Apple LLVM version 9.0.0 (clang-900.0.39.2)
Target: x86_64-apple-darwin16.7.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
This happens to me as well:
PS D:\Projects\Fun Stuff\daostack> truffle develop
Truffle Develop started at http://localhost:9545/
Accounts:
(0) 0x627306090abab3a6e1400e9345bc60c78a8bef57
(1) 0xf17f52151ebef6c7334fad080c5704d77216b732
(2) 0xc5fdf4076b8f3a5357c5e395ab970b5b54098fef
(3) 0x821aea9a577a9b44299b9c15c88cf3087f3b5544
(4) 0x0d1d4e623d10f9fba5db95830f7d3839406c6af2
(5) 0x2932b7a2355d6fecc4b5c0b6bd44cc31df247a2e
(6) 0x2191ef87e392377ec08e7c08eb105ef5448eced5
(7) 0x0f4f2ac550a1b4e2280d04c21cea7ebd822934b5
(8) 0x6330a553fc93768f612722bb8c2ec78ac90b3bbc
(9) 0x5aeda56215b167893e80b4fe645ba6d5bab767de
Private Keys:
(0) c87509a1c067bbde78beb793e6fa76530b6382a4c0241e5e4a9ec0a0f44dc0d3
(1) ae6ae8e5ccbfb04590405997ee2d52d2b330726137b875053c36d94e974d162f
(2) 0dbbe8e4ae425a6d2687f1a7e3ba17bc98c673636790f1b8ad91193c05875ef1
(3) c88b703fb08cbea894b6aeff5a544fb92e78a18e19814cd85da83b71f772aa6c
(4) 388c684f0ba1ef5017716adb5d21a053ea8e90277d0868337519f97bede61418
(5) 659cbb0e2411a44db63778987b1e22153c086a95eb6b18bdf89de078917abc63
(6) 82d052c865f5763aad42add438569276c00d3d88a2d062d36b2bae914d58b8c8
(7) aa3680d5d48a8283413f7a108367c7299ca73f553735860a87b08f39395618b7
(8) 0f62d96d6675f32685bbdb8ac13cda7c23436f63efbb9d07700d8669ff12b7c4
(9) 8d5366123cb560bb606379f90a0bfd4769eecc0557f1b362dcae9012b548b1e5
Mnemonic: candy maple cake sugar pudding cream honey rich smooth crumble sweet treat
truffle(develop)> test
Compiling .\contracts\Migrations.sol...
Compiling .\contracts\VotingMachines\AbsoluteVote.sol...
Compiling .\contracts\VotingMachines\EmergentVoteScheme.sol...
Compiling .\contracts\VotingMachines\IntVoteInterface.sol...
Compiling .\contracts\VotingMachines\QuorumVote.sol...
Compiling .\contracts\controller\Avatar.sol...
Compiling .\contracts\controller\Controller.sol...
Compiling .\contracts\controller\DAOToken.sol...
Compiling .\contracts\controller\Reputation.sol...
Compiling .\contracts\globalConstraints\GlobalConstraintInterface.sol...
Compiling .\contracts\globalConstraints\TokenCapGC.sol...
Compiling .\contracts\test\ActionMock.sol...
Compiling .\contracts\test\Debug.sol...
Compiling .\contracts\test\ExecutableTest.sol...
Compiling .\contracts\test\GlobalConstraintMock.sol...
Compiling .\contracts\universalSchemes\ContributionReward.sol...
Compiling .\contracts\universalSchemes\ExecutableInterface.sol...
Compiling .\contracts\universalSchemes\GenesisScheme.sol...
Compiling .\contracts\universalSchemes\GlobalConstraintRegistrar.sol...
Compiling .\contracts\universalSchemes\OrganizationRegister.sol...
Compiling .\contracts\universalSchemes\SchemeRegistrar.sol...
Compiling .\contracts\universalSchemes\SimpleICO.sol...
Compiling .\contracts\universalSchemes\UniversalScheme.sol...
Compiling .\contracts\universalSchemes\UniversalSchemeInterface.sol...
Compiling .\contracts\universalSchemes\UpgradeScheme.sol...
Compiling .\contracts\universalSchemes\VestingScheme.sol...
Compiling .\contracts\universalSchemes\VoteInOrganizationScheme.sol...
Compiling zeppelin-solidity/contracts/lifecycle/Destructible.sol...
Compiling zeppelin-solidity/contracts/math/SafeMath.sol...
Compiling zeppelin-solidity/contracts/mocks/StandardTokenMock.sol...
Compiling zeppelin-solidity/contracts/ownership/Ownable.sol...
Compiling zeppelin-solidity/contracts/token/BurnableToken.sol...
Compiling zeppelin-solidity/contracts/token/MintableToken.sol...
Compiling zeppelin-solidity/contracts/token/StandardToken.sol...
Compiling zeppelin-solidity\contracts\math\SafeMath.sol...
Compiling zeppelin-solidity\contracts\ownership\Ownable.sol...
Compiling zeppelin-solidity\contracts\token\BasicToken.sol...
Compiling zeppelin-solidity\contracts\token\ERC20.sol...
Compiling zeppelin-solidity\contracts\token\ERC20Basic.sol...
Compiling zeppelin-solidity\contracts\token\StandardToken.sol...
Contract: AbsoluteVote
✓ Sanity checks (3485ms)
✓ log the LogNewProposal event on proposing new proposal (1565ms)
✓ should log the LogCancelProposal event on canceling a proposal (982ms)
✓ should log the LogVoteProposal and LogCancelVoting events on voting and canceling the vote (1029ms)
✓ should log the LogExecuteProposal event (1395ms)
✓ All options can be voted (0-9) (2639ms)
✓ Double vote shouldn't double proposal's 'Option 2' count (1205ms)
✓ Vote cancellation should revert proposal's counters (1117ms)
✓ As allowOwner is set to true, Vote on the behalf of someone else should work (1014ms)
✓ As allowOwner is set to false, Vote on the behalf of someone else should NOT work (938ms)
✓ if the voter is not the proposal's owner, he shouldn't be able to vote on the behalf of someone else (892ms)
✓ Non-existent parameters hash shouldn't work (1085ms)
✓ Invalid percentage required( < 0 || > 100) shouldn't work (2029ms)
✓ Proposal voting or cancelling shouldn't be able after proposal has been executed (1180ms)
✓ the vote function should behave as expected (1976ms)
✓ cannot vote for another user (866ms)
✓ Should behave sensibly when voting with an empty reputation system (333ms)
✓ Should behave sensibly without an executable [TODO] execution isn't implemented yet (483ms)
✓ Proposal with wrong num of options (876ms)
✓ Test voteWithSpecifiedAmounts - More reputation than I own, negative reputation, etc.. (1139ms)
✓ Internal functions can not be called externally (963ms)
✓ Try to send wrong proposal id to the voting/cancel functions (1063ms)
✓ 2 proposals, 1 Reputation system (1703ms)
as _not_ proposal owner - vote for myself
✓ vote "Option 1" then vote "Option 2" should register "Option 2" (1139ms)
✓ vote "Option 3" then vote "Option 4" should register "Option 4" (1206ms)
as proposal owner - vote for another user
✓ vote "Option 1" then vote "Option 2" should register "Option 2" (1163ms)
✓ vote "Option 3" then vote "Option 4" should register "Option 4" (1015ms)
Contract: Avatar
✓ genericAction no owner (466ms)
✓ generic call (553ms)
✓ pay ether to avatar (1282ms)
✓ sendEther from (2278ms)
✓ externalTokenTransfer (592ms)
✓ externalTokenTransferFrom & ExternalTokenIncreaseApproval (648ms)
✓ externalTokenTransferFrom & externalTokenDecreaseApproval (784ms)
Contract: ContributionReward
✓ constructor (138ms)
✓ setParameters (1422ms)
✓ registerOrganization - check fee payment (2066ms)
✓ submitContribution log (1745ms)
✓ submitContribution fees (1779ms)
✓ submitContribution without registration -should fail (1371ms)
✓ submitContribution check owner vote (1457ms)
✓ submitContribution check beneficiary==0 (1716ms)
✓ execute submitContribution yes (1778ms)
✓ execute submitContribution mint reputation (1759ms)
✓ execute submitContribution mint tokens (1519ms)
✓ execute submitContribution send ethers (2403ms)
✓ execute submitContribution send externalToken (2165ms)
✓ execute submitContribution proposal decision=='no' send externalToken (1932ms)
Contract: Controller
✓ mint reputation via controller (667ms)
✓ mint tokens via controller (633ms)
✓ register schemes (606ms)
✓ register schemes - check permissions for register new scheme (24778ms)
✓ register schemes - check permissions for updating existing scheme (1408ms)
✓ unregister schemes (646ms)
✓ unregister none registered scheme (590ms)
✓ unregister schemes - check permissions unregister scheme (20398ms)
✓ unregister self (753ms)
✓ isSchemeRegistered (1787ms)
✓ addGlobalConstraint (1804ms)
✓ removeGlobalConstraint (2442ms)
✓ upgrade controller (945ms)
✓ upgrade controller check permission (873ms)
✓ generic call (887ms)
✓ sendEther (2555ms)
✓ externalTokenTransfer (1030ms)
✓ externalTokenTransferFrom & ExternalTokenIncreaseApproval (1196ms)
✓ externalTokenTransferFrom & externalTokenDecreaseApproval (1799ms)
✓ globalConstraints mintReputation add & remove (1035ms)
✓ globalConstraints mintTokens add & remove (1211ms)
✓ globalConstraints register schemes add & remove (1071ms)
✓ globalConstraints unregister schemes add & remove (1001ms)
✓ globalConstraints generic call add & remove (1551ms)
✓ globalConstraints sendEther add & remove (3490ms)
✓ globalConstraints externalTokenTransfer add & remove (1294ms)
✓ globalConstraints externalTokenTransferFrom , externalTokenIncreaseApproval , externalTokenDecreaseApproval (2016ms)
Contract: DAOToken
✓ should put 0 Coins in the first account (70ms)
✓ should be owned by its creator (83ms)
✓ should be destructible (87ms)
✓ should mint tokens to owner account (7614ms)
✓ should allow minting tokens only by owner (6067ms)
✓ log the Mint event on mint (101ms)
✓ mint should be reflected in totalSupply (220ms)
✓ mint should be reflected in balances (110ms)
✓ totalSupply is 0 on init (77ms)
✓ burn (190ms)
onlyOwner
✓ mint by owner (96ms)
✓ mint by not owner (71ms)
Contract: GenesisScheme
✓ founders should get their share in reputation and tokens (2426ms)
✓ forgeOrg check avatar (383ms)
✓ forgeOrg check reputations and tokens to founders (328ms)
✓ forgeOrg check transfer ownership (405ms)
✓ setSchemes log (391ms)
✓ setSchemes from account that does not hold the lock (306ms)
✓ setSchemes increase approval for scheme (413ms)
✓ setSchemes increase approval for scheme without fee (346ms)
✓ setSchemes check register (359ms)
✓ setSchemes check unregisterSelf (466ms)
✓ setSchemes delete lock (373ms)
Contract: GlobalConstraintRegistrar
✓ constructor (142ms)
✓ setParameters (777ms)
✓ registerOrganization - check fee payment (1457ms)
✓ proposeGlobalConstraint log (1580ms)
✓ proposeGlobalConstraint without registration -should fail (1292ms)
✓ proposeGlobalConstraint check owner vote (1269ms)
✓ execute proposeGlobalConstraint (1612ms)
✓ proposeToRemoveGC log (1717ms)
✓ proposeToRemoveGC without registration -should fail (1189ms)
✓ proposeToRemoveGC check owner vote (1829ms)
✓ execute proposeToRemoveGC (2184ms)
✓ execute proposeToRemoveGC (same as proposeGlobalConstraint) vote=NO (1693ms)
Contract: Migrations
✓ should have deployed entire DAOStack
Contract: OrganizationRegister
✓ constructor (150ms)
✓ setParameters (192ms)
✓ registerOrganization - check fee payment (1804ms)
✓ addOrPromoteAddress add and promote (2300ms)
✓ addOrPromoteAddress add without enough fee should fail (1163ms)
✓ addOrPromoteAddress add without regisration -should fail (1266ms)
Contract: QuorumVote
✓ Sanity checks (1451ms)
✓ Quorum proposals should be executed when reaching the percentage required (1332ms)
✓ Invalid inputs shouldn't work (precReq, vote) (6588ms)
✓ All options can be voted (0-9) (2502ms)
✓ Double vote shouldn't double proposal's 'Option 2' count (1068ms)
✓ Vote cancellation should revert proposal's counters (1067ms)
✓ As allowOwner is set to true, Vote on the behalf of someone else should work (2654ms)
✓ As allowOwner is set to false, Vote on the behalf of someone else should NOT work (1128ms)
✓ if the voter is not the proposal's owner, he shouldn't be able to vote on the behalf of someone else (1200ms)
✓ Should not able to vote / cancel vote / proposal after proposal has been executed (1289ms)
✓ Only the owner of the proposal can cancel it (1463ms)
✓ log the LogNewProposal event on proposing new proposal (1231ms)
✓ Should log the LogCancelProposal event on canceling a proposal (857ms)
✓ Should log the LogVoteProposal and LogCancelVoting events on voting and canceling the vote (999ms)
✓ Should log the LogExecuteProposal event on executing quorum proposal with 'no' decision (925ms)
✓ cannot vote for another user (998ms)
✓ Should behave sensibly without an executable [TODO] execution isn't implemented yet (458ms)
✓ Test voteWithSpecifiedAmounts - More reputation than I own, negative reputation, etc.. (982ms)
✓ Internal functions can not be called externally (2591ms)
✓ Try to send wrong proposal id to the voting/cancel functions (1798ms)
Contract: Reputation
1) "before all" hook: prepare suite
Contract: SchemeRegistrar
2) "before all" hook: prepare suite
Contract: SimpleICO
3) "before all" hook: prepare suite
Contract: TokenCapGC
4) "before all" hook: prepare suite
Contract: UpgradeScheme
5) "before all" hook: prepare suite
Contract: VestingScheme
6) "before all" hook: prepare suite
Contract: VoteInOrganizationScheme
7) "before all" hook: prepare suite
137 passing (4m)
7 failing
1) Contract: Reputation "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
2) Contract: SchemeRegistrar "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
3) Contract: SimpleICO "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
4) Contract: TokenCapGC "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
5) Contract: UpgradeScheme "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
6) Contract: VestingScheme "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
7) Contract: VoteInOrganizationScheme "before all" hook: prepare suite:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:9401:17)
at new ProviderError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325054:24)
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325137:17
at C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:325195:24
at XMLHttpRequest.request.onreadystatechange (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:328229:7)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176415:18)
at XMLHttpRequest._setReadyState (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176705:12)
at XMLHttpRequest._onHttpRequestError (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176895:12)
at ClientRequest.<anonymous> (C:\Users\Alon\AppData\Roaming\nvm\v8.9.3\node_modules\truffle\build\cli.bundled.js:176765:24)
Repo: https://github.com/daostack/daostack
I am running Windows 10, so this doesn't seem to be OS-related.
For me it happens around 50% of the time locally, and it also happens during Travis builds.
This issue has been open for a while and looks a lot like the non-deterministic ganache bug which @benjamincburns fixed recently. Downloading the latest ganache-cli (read the release notes!) should resolve this problem.
Closing but if anyone continues to see this error please re-open or comment. Thanks for reporting.
I'm getting this error message when connecting to Rinkeby with Infura, truffle-wallet-provider, and ethereumjs-wallet. I'm not convinced this is Ganache; maybe my error is caused by something else.
@nickjm Could you provide context (what are you doing in your code) or a reproduction path? A stack trace might also be helpful.
I have this issue in around 10% of my builds, randomly, with 128 tests active.
The tests are running on the default dummy node (Ganache?), with the latest version of Truffle.
Interestingly enough, it only fails when built in our CI (Drone, dockerized), not locally.
Our environments are based on Ubuntu and use standard Docker Node.js Carbon images.
Same problem on Windows 10:
1) Contract: ...
AssertionError: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at Context.it (...)
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:188:7)
2) Contract: ... "after each" hook: after test for "...":
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\truffle-error\index.js:10:1)
at new ProviderError (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\truffle-provider\error.js:17:1)
at C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\truffle-provider\wrapper.js:71:1
at C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\truffle-provider\wrapper.js:129:1
at XMLHttpRequest.request.onreadystatechange (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\web3\lib\web3\httpprovider.js:128:1)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\xhr2\lib\xhr2.js:64:1)
at XMLHttpRequest._setReadyState (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\xhr2\lib\xhr2.js:354:1)
at XMLHttpRequest._onHttpRequestError (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\xhr2\lib\xhr2.js:544:1)
at ClientRequest.<anonymous> (C:\Users\...\AppData\Roaming\npm\node_modules\truffle\build\webpack:\~\xhr2\lib\xhr2.js:414:1)
at Socket.socketErrorListener (_http_client.js:387:9)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
@cgewecke:
Any news on this?
@barakman No, everyone's seeing it intermittently. AFAIK it predates recent work to stabilize the test client, and it's possible that it's related to issue 453 at ganache-cli, which is under bounty and currently being worked on. There's more detail over there if you're interested.
Unfortunately this looks like (as @benjamincburns would say) a heisenbug. If anyone finds a consistent way of reproducing it they will be greeted with delight.
@cgewecke:
Thank you for the info.
P.S.: To me, heisenbug is kind of instinctively associated with Heisenberg (Uncertainty Principle's Heisenberg, not Breaking Bad's Heisenberg). More specifically, something like - "if you touch it, you change it", or in other words - this bug is unsolvable by definition, which I sincerely hope is not the case here... Or perhaps you were referring to the bug being uncertain, not the actual solution (and perhaps I went slightly off track with my "associativity")...
In my case, I see this occurring at random indeed, but - and this is actually very deterministic - in only 1 out of 23 scripts which truffle test executes.
I've been banging my head for a while, trying to figure out in what sense this script is different from all the others, which could explain this. But I have failed to find any notable difference, so I cannot think of any sustainable workaround.
Thanks
@barakman Ahhh!! That's pretty good isolation of the problem. It could be something in the script you identified. Could also be related to the test execution that precedes it.
If there's any possibility of having another set of eyes look at the codebase, we'd definitely be interested.
@cgewecke:
Most likely not related to the preceding script, as I also tried to truffle test this script alone, and the problem persisted.
I'm not sure that I'm allowed to post the code publicly at this point, so I will try to minimize both the Solidity contract and the Javascript test, and if I am still able to reproduce the problem then I will post it here.
Another interesting point is that I haven't seen it happen when using testrpc-sc instead of ganache-cli.
So the bug can probably be reduced to the differences between those two (and we can probably rule out its being related to the truffle suite itself or to mocha).
@barakman Ok great, thank you. One thing about testrpc-sc is that it's much slower. If the underlying issue is a race condition, testrpc-sc could be introducing delays that mask it. Another important difference is that it uses an older version of ethereumjs-vm (from the fall). Possible clue there too, although we have been seeing these disconnections all year.
@cgewecke:
I can consistently reproduce the problem with the code below.
MyContract.sol:
pragma solidity ^0.4.18;

contract MyContract {
    uint256 public constant ONE = 1000000000;
    uint256 public constant GAMMA = 179437500000000000000000000000000000000000;
    uint256 public constant DELTA = 29437500;

    function buy(uint256 x, uint256 y, uint256 alpha, uint256 beta) external pure returns (uint256) {
        uint256 temp = alpha - beta * y;
        return x * (temp * ONE) / (temp * (ONE - DELTA) + GAMMA);
    }

    function sell(uint256 x, uint256 y, uint256 alpha, uint256 beta) external pure returns (uint256) {
        uint256 temp = alpha - beta * y;
        return x * (temp * (ONE + DELTA) - GAMMA) / (temp * ONE);
    }
}
MyContractUnitTest.js:
contract("MyContractUnitTest", () => {
let interval = ["0", "20000000000000000000000000", "10000000000000000000000000000000000", "0"];
let NUM_OF_TESTS_PER_INTERVAL = 10;
function buyFunc(x, y, alpha, beta) {
let temp = alpha.minus(beta.times(y));
return x.times(temp.times(ONE)).dividedBy(temp.times(ONE.minus(DELTA)).plus(GAMMA));
}
function sellFunc(x, y, alpha, beta) {
let temp = alpha.minus(beta.times(y));
return x.times(temp.times(ONE.plus(DELTA)).minus(GAMMA)).dividedBy(temp.times(ONE));
}
async function buy(x, y, alpha, beta) {
let fixedPoint = await myContract.buy(x, y, alpha, beta);
let floatPoint = buyFunc(x, y, alpha, beta);
return [fixedPoint, floatPoint];
}
async function sell(x, y, alpha, beta) {
let fixedPoint = await myContract.sell(x, y, alpha, beta);
let floatPoint = sellFunc(x, y, alpha, beta);
return [fixedPoint, floatPoint];
}
let myContract;
let ONE;
let GAMMA;
let DELTA;
before(async () => {
myContract = await artifacts.require("MyContract.sol").new();
ONE = await myContract.ONE();
GAMMA = await myContract.GAMMA();
DELTA = await myContract.DELTA();
});
let AMOUNT = web3.toBigNumber(1000000);
for (let func of [buy, sell]) {
describe(`${func.name}:`, async () => {
for (let row = 0; row < 100; row++) {
for (let col = 0; col < 10; col++) {
let [minN, maxN, alpha, beta] = interval.map(x => web3.toBigNumber(x));
let incN = maxN.minus(minN).dividedBy(NUM_OF_TESTS_PER_INTERVAL - 1);
for (let i = 0; i < NUM_OF_TESTS_PER_INTERVAL; i++) {
let y = minN.plus(incN.times(i)).truncated();
it(`interval ${row} ${col}, test ${i}`, async () => {
try {
let [fixedPoint, floatPoint] = await func(AMOUNT, y, alpha, beta);
let ratio = fixedPoint.dividedBy(floatPoint);
assert(ratio.greaterThanOrEqualTo("0.99999") && ratio.lessThanOrEqualTo("1"), `ratio = ${ratio.toFixed()}`);
}
catch (error) {
assert(false, error.message);
}
});
}
}
}
});
}
});
My setup (as mentioned in a previous comment on this thread) is:
Thanks
@barakman Great!! Thank you. Going to open a companion issue over at ganache-cli and will talk to @benjamincburns about this and get his input.
@cgewecke: Also happens on testrpc-sc if you try "hard enough"...
@cgewecke: I am converging on the conclusion that this problem stems from improper usage of the Mocha framework. More precisely, improper usage of the before, beforeEach, afterEach and after hooks.
The typical error messages, although poorly phrased, imply this conjecture as well:
"before all" hook...."before each" hook..."after each" hook..."after all" hook...You can read the relevant information here, namely:
You may also pick any file and add โrootโ-level hooks. For example, add
beforeEach()outside of alldescribe()blocks. This will cause the callback tobeforeEach()to run before any test case, regardless of the file it lives in.
So using these hooks on the root-level might be a bad idea in this case, since due to the nature of the tested system (communication with a TestRPC or Ganache process), they typically execute asynchronous code.
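A minimal sketch of the difference (illustrative only, not taken from the actual suite): a root-level hook runs around every test in every file, while a hook declared inside a contract()/describe() block only wraps that suite:

// Root-level hook: declared outside any describe()/contract() block,
// so Mocha runs it before every test in every file of the run.
beforeEach(async () => {
    // asynchronous web3 call executed before ALL tests, even in other files
});

contract("SomeContract", () => {
    // Scoped hook: only wraps the tests of this suite.
    beforeEach(async () => {
        // asynchronous setup for this suite only
    });

    it("does something", async () => {
        // ...
    });
});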
@barakman Agree it seems like disconnections happen at the 'seams' of the suites where the hooks are. The code at truffle-core here that sets up the 'contract' suite looks to me like it safely binds hooks within a describe and executes them asynchronously via Mocha's done callback. Do you see another place where they might be invoked?
@cgewecke:
I have removed all my root-level hooks, and I'm still encountering disconnections.
However, as opposed to before, the errors that I am getting are always "after each" hook....
And to be absolutely clear on that, I don't even have an afterEach hook in any of my tests!
Maybe Mocha adds implicit calls to afterEach, when an explicit call to beforeEach exists in the code.
However, so long as these implicit calls are not added at the root-level, it does not support my conjecture of hooked code running in the wrong scope to begin with.
In short, I'm at a loss here...
By the way, in the code that you linked, there doesn't seem to be support for the after hook, and I am using this hook in my tests.
@barakman Hmmm... that's a nice observation about after - I wonder if that's causing problems.
@cgewecke:
Thanks.
I removed all the after hooks in my code (replaced them with its), and now I'm getting a "before each" hook... error.
So I suspect that, one way or another, these hooks are not incorporated properly into Truffle.
In order to bring empirical evidence for this conjecture, I am now trying the following workaround:
- replace the before hook with an it
- replace the after hook with an it
- move the beforeEach hook code to the beginning of each one of the its which follow
- move the afterEach hook code to the end of each one of the its which follow

If all tests pass without disconnections, then I'm pretty sure that we can stamp this as the cause of the problem. And even if not, I think that it could still be related to hooks which are added implicitly by the Mocha framework.
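A sketch of what that workaround looks like in practice (names are illustrative, not the actual suite):

contract("MyContractUnitTest", () => {
    let myContract;

    // The before hook becomes an ordinary test that runs first.
    it("prepare suite", async () => {
        myContract = await artifacts.require("MyContract.sol").new();
    });

    it("some test", async () => {
        // code that used to live in beforeEach is inlined at the top...
        let one = await myContract.ONE();
        assert(one.greaterThan(0));
        // ...and code that used to live in afterEach is inlined at the end.
    });
});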
@cgewecke:
I take back my previous conjecture of this error occurring on a given test as a result of something which has executed on a previous test.
This is because the error occurs when I execute truffle test separately for each test file (i.e., running truffle test test/SomeFile.js sequentially for each JS file).
Moreover, in between calls to truffle test, I close and reopen the ganache-cli process.
So this error cannot be related to any previous state stored by either truffle test or ganache-cli (unless one of them saves some "global information" in the operating-system's temp folder or something like that, which I sincerely doubt).
In addition to that, I have recently tested the new ganache-cli beta version (7.0.0-beta.0), where the problem persists.
I have posted my findings on a similar GitHub thread, which is closed by now but which I am hoping will reopen.
Thanks
@cgewecke:
I have conducted more extensive research by modifying the file /node_modules/truffle/build/cli.bundled.js.
I started off by checking which path leads to the NOT_CONNECTED_MESSAGE error message ('Invalid JSON RPC response: ""'):
- send: function(originalSend, preHook, postHook) - the synchronous path
- sendAsync: function(originalSendAsync, preHook, postHook) - the asynchronous path

As expected, this error occurs only in the asynchronous path.
Second, I added some logging in this path, just before invoking callback(error, result):
if (payload.params == undefined)
    console.log(result.id, payload.method, 'no params');
else
    console.log(result.id, payload.method, payload.params.length);
Here is the consistency that I have observed:
- result.id starts from a very large value, and grows on each evm operation, for example:
- result.id starts from 1 and increments by 1 on each net or eth operation, for example:
- result.id and payload.method are undefined;
- result.id increments by 2 on the next net or eth operation, for example:
- result.id is undefined but payload.method is valid;
- payload.method is sometimes eth_getLogs for the first time in the entire test.

I am hopeful that the above information will provide some clues towards the source of this problem.
And since I am able to reproduce it repeatedly, I will be happy to generate more logging in case you have any specific requests.
Thanks
@cgewecke:
Update to the above:
I later realized that:
- the id is in result and the method is in payload, or
- the id is in result[0] and the method is in payload[0]

I therefore changed the logging as follows:
if (result.id != undefined)
    console.log(`Normal: id = ${result.id}, method = ${payload.method}`);
else if (result[0] != undefined)
    console.log(`In [0]: id = ${result[0].id}, method = ${payload[0].method}`);
else
    console.log(`Problem: result = ${result}, payload = ${JSON.stringify(payload, null, 4)}`);
The new logging has refined my previous observation, from this:
result.id = undefined
payload.method = undefined
To this:
result[0].id = <the previous id + 1>
payload[0].method = eth_getFilterChanges
All of that, during normal execution of course.
What happens right before the error can be described as follows:
In the last one or two cases, the result is empty.
The contents which the result normally holds:
- jsonrpc is still available in the payload
- id is still available in the payload
- result is not available anywhere

Is it possible that somewhere in ganache-cli code, the payload is initialized but the result is not?
That could most certainly be classified as an "Invalid JSON RPC response" (which you guys can easily resolve).
Thanks
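For reference, the shapes involved (spec-level examples, not values captured from the failing run): a single JSON-RPC response carries jsonrpc/id/result at the top level, a batch response is an array of such objects (hence result[0].id), and the failing case is an empty body that JSON.parse rejects, which is what produces 'Invalid JSON RPC response: ""':

// Single response: fields live on the object itself.
const single = { jsonrpc: "2.0", id: 42, result: "0x1" };

// Batch response: an array of response objects, so the first id is at result[0].id.
const batch = [
    { jsonrpc: "2.0", id: 43, result: "0x1" },
    { jsonrpc: "2.0", id: 44, result: [] }
];

// The failing case as seen by the provider: an empty response body.
try {
    JSON.parse("");       // throws SyntaxError
} catch (e) {
    console.log(e.name);  // this is the path that ends in "Invalid JSON RPC response"
}
console.log(single.id, batch[0].id);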
Some more observations:
When the last two result objects are empty:
- the payload object always contains method = eth_getLogs and id = <some very large number>
- the truffle test error is always "after each" hook: after test for...

When only the last result object is empty:
- the payload object always contains method != eth_getLogs and id = <the previous id + 1>
- the truffle test error is either "before each" hook: before test for... or No events were emitted

@barakman Thank you... this analysis is really helpful.
Is it possible that somewhere in ganache-cli code, the payload is initialized but the result is not?
@benjamincburns Does anything jump out at you as a possibility in the preceding three comments?
@cgewecke:
I've been doing my research only on the truffle side, thinking I wouldn't be able to do much on the ganache side because all the code inside /node_modules/ganache appeared "compacted".
But then I realized that I could go directly to the source code on GitHub, so I'm now looking there too, trying to connect the dots...
I'm doing it without logging this time, because I would essentially need to reinstall it (npm) after every change - too much hassle...
Thank you!
@cgewecke:
BTW, preliminary observation in ganache code:
It seems that the jsonrpc field is mostly set to "2.0" (string) and rarely set to 2.0 (double).
Perhaps it doesn't make any difference when sent from the ganache process to the truffle process, but I would nevertheless make it consistent across the project.
@cgewecke:
A question about line 100 in file /ganache-core-develop/lib/provider.js:
callback(response.error ? err : null, response);
Looking a few lines above it, the variable response is either an object or an array of objects.
When it is an array of objects, I believe that response.error is undefined, hence the expression response.error ? err : null evaluates to null.
Is this really the desired behavior in this case?
Thanks
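A sketch of the kind of check being suggested (not the actual ganache-core code - just an illustration of handling both the single-object and the array case):

// Hypothetical helper: report an error for a single response with .error,
// or for a batch (array) response in which any entry has .error.
function responseHasError(response) {
    if (Array.isArray(response)) {
        return response.some(entry => entry && entry.error);
    }
    return Boolean(response && response.error);
}

// ...so the call could become:
// callback(responseHasError(response) ? err : null, response);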
@barakman Just talked to @benjamincburns about this and he thinks it's ok. Have published an experimental truffle build which removes the eth_getLogs attempt in truffle test. If you have a chance, could you try running your suite with it and see if it makes any difference? Maybe we can isolate this problem to that call.
npm install -g darq-truffle@barakman
darq-truffle test # Example command
@cgewecke:
I will do so, thank you very much!!!
But I must also remind you that, as I mentioned above, in some failure cases the eth_getLogs method was not present in the erroneous payload (i.e., the payload which came along with an empty result did not contain this method).
@barakman Yes, I'm hoping that that's because the calls are getting queued over there or something. This is kind of a shot in the dark with a low likelihood of success, sorry.
@cgewecke:
My contracts are at Solidity 0.4.18, and my Truffle is consequently at v4.13.
Can I use my local Truffle for compile and your proposed Truffle (which I have installed globally as you suggested) for test, or is it not likely to work?
@barakman Oh good question - the compiler is 0.4.23. If your pragmas use a caret you should have no problem. But if there's an issue let me know and I will republish with a build that lets you use any compiler you want.
@cgewecke:
In any case, the problem persists.
Here's the error message from (your) truffle:
Error: Could not connect to your Ethereum client. Please check that your Ethereum client:
- is running
- is accepting RPC connections (i.e., "--rpc" option is used in geth)
- is accessible over the network
- is properly configured in your Truffle configuration file (truffle.js)
at ProviderError.ExtendableError (C:\Users\...\webpack:\dependencies\truffle-error\index.js:10:1)
at new ProviderError (C:\Users\...\webpack:\dependencies\truffle-provider\error.js:17:1)
at C:\Users\...\webpack:\dependencies\truffle-provider\wrapper.js:71:1
at C:\Users\...\webpack:\dependencies\truffle-provider\wrapper.js:129:1
at XMLHttpRequest.request.onreadystatechange (C:\Users\...\webpack:\dependencies\truffle-provider\~\web3\lib\web3\httpprovider.js:128:1)
at XMLHttpRequestEventTarget.dispatchEvent (C:\Users\...\webpack:\dependencies\truffle-provider\~\xhr2\lib\xhr2.js:64:1)
at XMLHttpRequest._setReadyState (C:\Users\...\webpack:\dependencies\truffle-provider\~\xhr2\lib\xhr2.js:354:1)
at XMLHttpRequest._onHttpRequestError (C:\Users\...\webpack:\dependencies\truffle-provider\~\xhr2\lib\xhr2.js:544:1)
at ClientRequest.<anonymous> (C:\Users\...\webpack:\dependencies\truffle-provider\~\xhr2\lib\xhr2.js:414:1)
at Socket.socketErrorListener (_http_client.js:387:9)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
Note that there is no "after each" hook: after test for... in this case, which from previous empirical experience tells us that the last payload's method was not eth_getLogs.
But then again, it couldn't have been eth_getLogs anyway, since you've removed it, right?
I could run a few more tests to make sure that the error message never contains the "after each" hook: after test for... part.
It wouldn't really provide a great deal of insight, but perhaps it might indicate (if that is indeed the case, of course) that one of the several paths which lead to this error has been resolved.
Thanks
@barakman Was just looking at your original reproduction case and noticed that the describe is async. Mocha doesn't support async for describe, unfortunately. If you have a chance, could you check and see if that occurs anywhere in the suite you're running? You can just change those to regular functions to make sure they're running/resolving in strict sequence.
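To illustrate (a generic Mocha example, not the actual suite): the callback passed to describe() should be a plain synchronous function - Mocha does not await it, so an async callback may register its tests later than expected:

// Problematic: Mocha does not await the promise returned here.
describe("suite", async () => {
    // anything after an `await` in this callback runs "too late"
});

// Preferred: a regular function; put asynchronous setup in hooks or tests.
describe("suite", function () {
    before(async function () {
        // async setup belongs here
    });
    it("test", async function () {
        // ...
    });
});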
@cgewecke:
I have already changed that in all of my tests a while ago.
I have also posted a simplified version of this reproduction case quite recently, in which the describe takes a non-async function.
I posted it in order to inform benjamincburns that the problem persists with [email protected].
You can read it at the bottom of this thread.
@cgewecke:
I have added logging to your darq-truffle version:
sendAsync: function(originalSendAsync, preHook, postHook) {
    ...
    if (result.id == undefined && result[0] != undefined)
        console.log(`Problem: result = ${result}, payload = ${JSON.stringify(payload, null, 4)}`);
    callback(error, result);
    ...
}
Now, upon a "Could not connect to your Ethereum client" error, the log output seems to be one of the following cases:
- payload.method = eth_blockNumber, and the error message contains "before each" hook...
- payload.method = eth_call, and the error message does not contain any additional information.
- payload.method = net_version, and the error message does not contain any additional information.

As expected, there seems to be a slight reduction in the number of different failure scenarios, with the payload.method no longer being eth_getLogs, and the error message no longer containing "after each" hook...
So perhaps the next place to look into is the other 3 methods mentioned above - eth_blockNumber, eth_call and net_version.
I would always feel better finding and fixing a single source of a given problem, which I'm pretty sure is also the case here, but perhaps we should start off by eliminating each path that leads to the problem, and eventually trace the exact point of failure.
@barakman Agreed - I'll look into how to disable those for debugging...
@cgewecke:
The problem reproduces when I use GETH instead of Ganache.
I have also removed compiler optimization (see below) in order to rule out this factor as well:
solc: {
    optimizer: {
        enabled: false,
        runs: 5000000,
    }
}
Now, on the one hand, this implies that the problem is in Truffle.
But on the other hand, I'm guessing that Ganache and GETH are possibly using the same code-base.
So I'm not entirely confident that it is indeed Truffle that is to blame.
@cgewecke:
A very important observation IMO:
request.onreadystatechange = function () {
    if (request.readyState === 4 && request.timeout !== 1) {
        var result = request.responseText;
        var error = null;
        try {
            result = JSON.parse(result);
        } catch (e) {
            error = errors.InvalidResponse(request.responseText);
        }
        callback(error, result);
    }
};
As long as request.status == 200, the responseText is a valid JSON RPC.
Once request.status == 0, the responseText is empty (hence an invalid JSON RPC).
Most likely, you need to ensure both request.readyState === 4 and request.status === 200.
I think that you may as well get rid of the request.timeout !== 1 assertion while you're at it, but I'm not an HTTP expert so I'll leave that decision to you.
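A sketch of the amended handler being proposed (based on the web3 0.x code quoted above; the exact error value to report for a failed request is an open choice, so a plain Error is used here as a placeholder):

request.onreadystatechange = function () {
    if (request.readyState === 4 && request.timeout !== 1) {
        // status 0 means the request never completed (reset/refused/aborted)
        // and responseText is empty, so don't try to JSON.parse it.
        if (request.status !== 200) {
            callback(new Error('CONNECTION ERROR: HTTP status ' + request.status), null);
            return;
        }
        var result = request.responseText;
        var error = null;
        try {
            result = JSON.parse(result);
        } catch (e) {
            error = errors.InvalidResponse(request.responseText);
        }
        callback(error, result);
    }
};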
I have not been able to resolve all my problems by adding this condition, as truffle simply terminates without any errors at all.
However, I'm pretty sure that the solution is tightly related with this issue.
It also explains the non-deterministic behavior (Heisenbug, in your words) that we've witnessed, as HTTP responses tend to behave this way (particularly with regard to the partially-duplicated information embedded in the readyState and status fields).
@barakman Locating the problem here is a huge discovery, thank you. Ganache and Geth are very far apart from a code-base standpoint, so I think this means the issue is very likely on this side. The code you've referenced is at web3...
@cgewecke:
But I see it in /node_modules/truffle/build/cli.bundled.js.
Are you "linking" Web3 code as part of Truffle's "build" process?
If so, how can we wrap things up on this issue?
i.e., should I post this on Web3's GitHub?
Thanks!
@barakman We bundle all of our dependencies together and one of them is web3 - I'm just noting the location so we know where to investigate. I'm not sure what to do... I guess I'd like to look at the values coming through that block and understand this better. What states trigger this crash?
As long as request.status == 200, the responseText is a valid JSON RPC.
Once request.status == 0, the responseText is empty (hence an invalid JSON RPC).
Most likely, you need to ensure both request.readyState === 4 and request.status === 200.
@cgewecke:
If I understand your question of 'What states trigger this crash?' correctly, then the code that I've used in order to trigger this crash is given below (please let me know if that's not what you meant).
Using Truffle + GETH, the problem is actually reproduced very consistently, at around test 540 (though it may be affected by GETH verbosity since it is highly timing-dependent).
On-Chain:
pragma solidity ^0.4.18;

contract MyContract {
    uint256 public constant ONE = 1000000000;
    uint256 public constant GAMMA = 179437500000000000000000000000000000000000;
    uint256 public constant DELTA = 29437500;
    uint256 public constant AMOUNT = 1000000;

    function buy(uint256 x, uint256 alpha, uint256 beta) external pure returns (uint256) {
        uint256 temp = alpha - beta * x;
        return AMOUNT * (temp * ONE) / (temp * (ONE - DELTA) + GAMMA);
    }

    function sell(uint256 x, uint256 alpha, uint256 beta) external pure returns (uint256) {
        uint256 temp = alpha - beta * x;
        return AMOUNT * (temp * (ONE + DELTA) - GAMMA) / (temp * ONE);
    }
}
Off-Chain:
contract("MyContractTest", function() {
let hMyContract;
let ONE;
let GAMMA;
let DELTA;
let AMOUNT;
let NUM_OF_TESTS = 10;
let minN = web3.toBigNumber("0");
let maxN = web3.toBigNumber("20000000000000000000000000");
let alpha = web3.toBigNumber("10000000000000000000000000000000000");
let beta = web3.toBigNumber("0");
let incN = maxN.minus(minN).dividedBy(NUM_OF_TESTS - 1);
describe("accuracy assertion:", function() {
before(async function() {
hMyContract = await artifacts.require("MyContract.sol").new();
ONE = await hMyContract.ONE();
GAMMA = await hMyContract.GAMMA();
DELTA = await hMyContract.DELTA();
AMOUNT = await hMyContract.AMOUNT();
});
for (let func of [buy, sell]) {
for (let n = 0; n < 1000; n++) {
for (let i = 0; i < NUM_OF_TESTS; i++) {
it(`${func.name} test ${n} ${i}`, async function() {
let x = minN.plus(incN.times(i)).truncated();
let [fixedPoint, floatPoint] = await func(x, alpha, beta);
let ratio = fixedPoint.dividedBy(floatPoint);
assert(ratio.greaterThanOrEqualTo("0.99999"), `ratio = ${ratio.toFixed()}`);
assert(ratio.lessThanOrEqualTo("1"), `ratio = ${ratio.toFixed()}`);
});
}
}
}
});
function buyFunc(x, alpha, beta) {
let temp = alpha.minus(beta.times(x));
return AMOUNT.times(temp.times(ONE)).dividedBy(temp.times(ONE.minus(DELTA)).plus(GAMMA));
}
function sellFunc(x, alpha, beta) {
let temp = alpha.minus(beta.times(x));
return AMOUNT.times(temp.times(ONE.plus(DELTA)).minus(GAMMA)).dividedBy(temp.times(ONE));
}
async function buy(x, alpha, beta) {
let fixedPoint = await hMyContract.buy(x, alpha, beta);
let floatPoint = buyFunc(x, alpha, beta);
return [fixedPoint, floatPoint];
}
async function sell(x, alpha, beta) {
let fixedPoint = await hMyContract.sell(x, alpha, beta);
let floatPoint = sellFunc(x, alpha, beta);
return [fixedPoint, floatPoint];
}
});
Thanks
@barakman Excellent, thank you!
@cgewecke:
Side note:
I'm not sure how your bundling procedure works or how exactly you integrate Web3 into Truffle, but the HttpProvider.prototype.sendAsync function appears 4 times; 3 of those implement timeout handling (asserting request.timeout !== 1 and setting request.ontimeout), and 1 doesn't.
I don't think that this is directly related to the problem at hand, but it might be implying that you are somehow bundling two different versions of Web3.
@cgewecke:
Some more observations, and a partial understanding of the problem:
I conducted a comparison between a successful HTTP request (valid responseText) and an unsuccessful HTTP request (empty responseText).
Now, the request object is rather large (let alone circular), so it is quite difficult to do a thorough comparison, but I did notice that the "bad" request contained the following (which the "good" request didn't):
{
Error: connect EADDRINUSE 127.0.0.1:8545
at Object._errnoException (util.js:1024:11)
at _exceptionWithHostPort (util.js:1046:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1182:14)
[stack]: 'Error: connect EADDRINUSE 127.0.0.1:8545\n at Object._errnoException (util.js:1024:11)\n at _exceptionWithHostPort (util.js:1046:20)\n at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1182:14)',
[message]: 'connect EADDRINUSE 127.0.0.1:8545',
code: 'EADDRINUSE',
errno: 'EADDRINUSE',
syscall: 'connect',
address: '127.0.0.1',
port: 8545
}
The EADDRINUSE error means that the ip port (127.0.0.1:8545 in this case) is currently busy, which made me think that perhaps this is not exactly a coding bug on either side.
I then checked the status of this port via (Windows command) netstat -aon | find "8545" and found something interesting:
As the test continues, more and more ports on 127.0.0.1 seem to be in use, waiting for something.
I believe that each one of them is waiting for a response on an HTTP request sent (asynchronously) to 127.0.0.1:8545.
When an unsuccessful HTTP request finally takes place, it seems that the highest port (65535) on 127.0.0.1 is waiting.
Here is what it looks like on my system:
First this:
TCP 127.0.0.1:8545 0.0.0.0:0 LISTENING 1820
TCP 127.0.0.1:49262 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49263 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49264 127.0.0.1:8545 TIME_WAIT 0
...
TCP 127.0.0.1:49275 127.0.0.1:8545 TIME_WAIT 0
Then this:
TCP 127.0.0.1:8545 0.0.0.0:0 LISTENING 1820
TCP 127.0.0.1:8545 127.0.0.1:49513 CLOSE_WAIT 1820
TCP 127.0.0.1:8545 127.0.0.1:49514 ESTABLISHED 1820
TCP 127.0.0.1:49262 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49263 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49264 127.0.0.1:8545 TIME_WAIT 0
... // mostly consecutive but not always
TCP 127.0.0.1:49512 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49513 127.0.0.1:8545 FIN_WAIT_2 15044
TCP 127.0.0.1:49514 127.0.0.1:8545 ESTABLISHED 15044
Then this:
TCP 127.0.0.1:8545 0.0.0.0:0 LISTENING 1820
TCP 127.0.0.1:8545 127.0.0.1:58487 CLOSE_WAIT 1820
TCP 127.0.0.1:8545 127.0.0.1:58488 ESTABLISHED 1820
TCP 127.0.0.1:49262 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49263 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49264 127.0.0.1:8545 TIME_WAIT 0
... // mostly consecutive but not always
TCP 127.0.0.1:58486 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:58487 127.0.0.1:8545 FIN_WAIT_2 15044
TCP 127.0.0.1:58488 127.0.0.1:8545 ESTABLISHED 15044
Then this:
TCP 127.0.0.1:8545 0.0.0.0:0 LISTENING 1820
TCP 127.0.0.1:49262 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49263 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49264 127.0.0.1:8545 TIME_WAIT 0
... // mostly consecutive but not always
TCP 127.0.0.1:61586 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:61587 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:61588 127.0.0.1:8545 ESTABLISHED 15044
And finally this:
TCP 127.0.0.1:8545 0.0.0.0:0 LISTENING 1820
TCP 127.0.0.1:49152 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49153 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49156 127.0.0.1:8545 TIME_WAIT 0
... // mostly consecutive but not always
TCP 127.0.0.1:65533 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:65534 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:65535 127.0.0.1:8545 TIME_WAIT 0
Interestingly enough, even at this point there still seem to be plenty of available ports between 0 and 65535 (most notably between 0 and 49000), but they don't seem to be used - presumably because the OS allocates client-side (ephemeral) ports only from the dynamic range, which on Windows is 49152-65535 by default.
In any case, once the truffle test process terminates, those waiting ports seem to "clean up" gradually, until only the listening port (8545) remains.
That is when I can execute truffle test once again (without shutting down the Ethereum client, GETH in this case).
So perhaps the solution to this problem is in Truffle - maintain a counter of the open requests, and whenever it reaches a certain threshold, delay before submitting the next request.
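For illustration only, a minimal throttling sketch of that idea (the names MAX_PENDING, pending and queue are hypothetical, not anything from Truffle):

const MAX_PENDING = 1000;      // threshold before delaying further requests
let pending = 0;
const queue = [];

function submit(sendFn) {
  if (pending < MAX_PENDING) {
    pending++;
    sendFn(() => {             // sendFn must call back when its request completes
      pending--;
      if (queue.length > 0) submit(queue.shift());
    });
  } else {
    queue.push(sendFn);        // delay until an earlier request has completed
  }
}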
Of course, I could also do this in my tests (i.e., Truffle input) but I don't think that's an appropriate solution.
Thanks!
@barakman RE: duplicate bundling - yes, that's likely happening. Web3 is widespread and not consistently versioned - have fixed this for the work we're doing on Truffle V5.
Can you tell approximately what the cap on the number of requests is? ~15,000?
@cgewecke:
This specific test starts off with 1 before call, which executes 5 asynchronous web3 calls.
It then proceeds with 20,000 it calls, each of which executing 1 asynchronous web3 call.
AFAIK (and you can probably verify this in Truffle), each web3 call yields 1 HTTP request to the Ethereum client (Ganache or GETH in this case).
On average, the test fails after 5,400 it calls or so. Hence if my assumption above is correct, then we're not talking about more than 5,000 pending requests.
Of course, the info that I provided on the previous comment does suggest ~15,000 requests, so I will run this again to make sure. I suppose that there were a lot of gaps (idle ports) in between 49152 and 65535.
Is it possible that you're not closing requests properly in Truffle? AFAIK there's no need to do that, but I'm not an HTTP expert, so I'm just bringing it up...
Thanks.
@cgewecke:
The exact number of busy ports immediately upon failure is 16353.
BTW, I just noticed that between 49152 and 65535 there are exactly 16384 ports, which also happens to be equal to 2^14. Not sure if it provides any additional insight though...
In any case, this means that there are HTTP requests issued by Truffle, which are not directly related to the test. Alternatively, my assumption of 1 request per web3 call is incorrect, and there are something like 3 or more requests per web3 call.
@cgewecke:
Some more analysis and an exact pinpoint of the problem (a solution if you will):
I have added a counter to test whether the number of opened HTTP requests in the system grows as the number of waiting ports seems to be growing.
I increase this counter after request.send(JSON.stringify(payload)) and decrease it in function request.onreadystatechange (upon request.readyState === 4).
The counter is zero when the failure occurs, which means that there are no opened requests at that point (this is in contrast with the number of waiting ports).
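A compact sketch of that instrumentation (openRequests is a hypothetical module-level counter; request and payload stand for the objects used inside web3's sendAsync):

let openRequests = 0;

function sendCounted(request, payload) {
  const original = request.onreadystatechange;
  request.onreadystatechange = function () {
    if (request.readyState === 4) openRequests--;   // response (or error) arrived
    if (original) original.apply(this, arguments);  // keep the existing handling
  };
  request.send(JSON.stringify(payload));
  openRequests++;                                   // request submitted
}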
I have found a post suggesting that the TIME_WAIT period is configurable in the OS, but that would be an OS-dependent solution, which I'd really hate.
I have done some reading on the XMLHttpRequest object, to see if I could somehow use it in order to "signal someone on the system" that the request is done and that the port can be closed.
I haven't found any such option (in fact, I think that there is really no such thing as "HTTP connection close" defined in the HTTP standard).
I did notice, however, that for asynchronous requests, you are using XHR2 instead of XMLHttpRequest.
I'm not sure about the difference between these two; I only understand that the former is a NodeJS implementation of the latter (which is natively a browser API).
Nevertheless, when I change the code to use XMLHttpRequest instead of XHR2, the test runs to completion!!!
Oddly enough, when the test is done, there are still some 16,000 ports in TIME_WAIT state.
However, this time, in addition to (something like) this:
TCP 127.0.0.1:49155 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49157 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:49165 127.0.0.1:8545 TIME_WAIT 0
...
TCP 127.0.0.1:65532 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:65533 127.0.0.1:8545 TIME_WAIT 0
TCP 127.0.0.1:65534 127.0.0.1:8545 TIME_WAIT 0
I also see (something like) this:
TCP 127.0.0.1:8545 127.0.0.1:49152 TIME_WAIT 0
TCP 127.0.0.1:8545 127.0.0.1:49153 TIME_WAIT 0
TCP 127.0.0.1:8545 127.0.0.1:49154 TIME_WAIT 0
...
TCP 127.0.0.1:8545 127.0.0.1:65531 TIME_WAIT 0
TCP 127.0.0.1:8545 127.0.0.1:65534 TIME_WAIT 0
TCP 127.0.0.1:8545 127.0.0.1:65535 TIME_WAIT 0
I'm not sure why exactly the test completes, and whether or not we can even consider the replacement of XHR2 with XMLHttpRequest a solution (though it does seem like a good workaround, at the very least).
But I think that we should focus our investigation on the difference between these two.
Thanks.
@barakman Great work! So glad you got that suite running.
Was also googling around about this yesterday and saw a thread that suggests another possibility is to pass a special header into the request telling it to close the connection when done, since the default behavior for HTTP is keep-alive. Example:
var options = {
  host: 'graph.facebook.com',
  port: 80,
  path: '/' + fb_id + '/picture',
  headers: { 'Connection': 'Close' }
};
The relevant web3 code is here.
Truffle invokes that constructor at truffle-provider here. If the problem can be addressed by adding headers there we'd be able to fix this directly.
If not it's quite a bit more complicated - web3 is a library written and maintained by the Ethereum Foundation. We consume (rather than write) it and it's non-trivial to get the code changed there (for good reason since that code drives much of the Ethereum JS eco-system).
If you're still investigating this and have a chance, could you see if setting the headers that way also resolves this?
@cgewecke:
Thank you!
I will also notify the web3 authors / contributors of the XHR2 findings.
As for the header suggestion: I don't construct the provider myself (truffle test relies on that anyway). The web3 class is globally available in all of my tests (not sure if because of Mocha or because of Truffle). So I'm not quite sure how or where to add this configuration.
Update:
In Truffle's bundled code, I changed this:

provider = new Web3.providers.HttpProvider("http://" + options.host + ":" + options.port);

To this:

provider = new Web3.providers.HttpProvider("http://" + options.host + ":" + options.port, 0, '', '', [{name: 'Connection', value: 'Close'}]);
For the code fix above, I get a message from Truffle (or from the Ethereum client):
Refused to set unsafe header "Connection"
I Googled it, and found this StackOverflow answer and this Web3 GitHub thread.
Do you have another suggestion?
Thanks.
You can work around the Refused to set unsafe header "Connection" error as follows: in the XMLHttpRequest.prototype._restrictedHeaders object, remove the connection key or change its value from true to false.
However, the bottom line result remains unchanged (i.e., the initial problem persists).
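For completeness, that monkey-patch would look roughly like this (a sketch which assumes, as described above, that the bundled XMLHttpRequest implementation - e.g. the xhr2 package - exposes _restrictedHeaders on its prototype):

const XHR2 = require("xhr2");

// remove the restriction on the "Connection" header...
delete XHR2.prototype._restrictedHeaders.connection;
// ...or keep the key but disable it:
// XHR2.prototype._restrictedHeaders.connection = false;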
@barakman Ah no, sorry I don't - I guess that's a dead end. Hmmmm.
@cgewecke:
So the only option currently at hand is to wire up a script in package.json that modifies the Truffle source code, and have it run after npm install and before npm test?
@cgewecke:
BTW (and yet again), there seem to be several different versions of HttpProvider.prototype.prepareRequest "bundled together" in the same Truffle package.
One of them actually uses an XMLHttpRequest object for asynchronous requests, which is how we'd like it to be.
The way I see it, there are two options here:
- Web3 has introduced the use of XHR2 some time ago.
- Web3 has revoked the use of XHR2 some time ago.
The first case might make it easier to push forward towards reverting this change, which seems harmful.
The second case is even better - simply move Truffle to use the newer version of Web3.
See below the various occurrences of HttpProvider.prototype.prepareRequest in the code.
Occurrence 1:
HttpProvider.prototype.prepareRequest = function (async) {
  var request;
  if (async) {
    request = new XHR2();
    request.timeout = this.timeout;
  } else {
    request = new XMLHttpRequest();
  }
  request.open('POST', this.host, async);
  request.setRequestHeader('Content-Type', 'application/json');
  return request;
};
Occurrence 2:
HttpProvider.prototype.prepareRequest = function (async) {
  var request = new XMLHttpRequest();
  request.open('POST', this.host, async);
  request.setRequestHeader('Content-Type', 'application/json');
  return request;
};
Occurrence 3:
HttpProvider.prototype.prepareRequest = function (async) {
  var request;
  if (async) {
    request = new XHR2();
    request.timeout = this.timeout;
  } else {
    request = new XMLHttpRequest();
  }
  request.open('POST', this.host, async);
  request.setRequestHeader('Content-Type', 'application/json');
  return request;
};
Occurrence 4:
HttpProvider.prototype.prepareRequest = function (async) {
  var request;
  if (async) {
    request = new XHR2();
    request.timeout = this.timeout;
  } else {
    request = new XMLHttpRequest();
  }
  request.open('POST', this.host, async);
  if (this.user && this.password) {
    var auth = 'Basic ' + new Buffer(this.user + ':' + this.password).toString('base64');
    request.setRequestHeader('Authorization', auth);
  }
  request.setRequestHeader('Content-Type', 'application/json');
  if (this.headers) {
    this.headers.forEach(function(header) {
      request.setRequestHeader(header.name, header.value);
    });
  }
  return request;
};
Thanks
@barakman Which version of truffle are you using? I will track that down and if this can be fixed by normalizing web3 versions will do that ASAP.
@cgewecke:
At present, I am using Truffle v4.1.3, with my Solidity contracts under v0.4.18.
I am planning to move to Truffle 4.1.5 as soon as I have an idle slot, but that will force me to upgrade my Solidity contracts to v0.4.23, and due to the syntactical changes (namely emit, constructor and the deprecation of var), that idle slot will have to be a little wider than what it would take to just change the Truffle version in package.json.
In short, I will be happy if this change (if indeed applicable) becomes available on Truffle v4.1.3, but Truffle v4.1.5 will also do just fine.
Thanks again for all your help!
@cgewecke:
Of course, it still needs to be asserted that this fix is not just a coincidental result of the timing-dependent nature of the problem (i.e., we must be able to explain it based on the functional difference between XHR2 and XMLHttpRequest).
@cgewecke:
A satisfactory proof:
In the HttpProvider.prototype.sendAsync function, I added console.log(request.getAllResponseHeaders()) upon response (in the onreadystatechange callback function).
When the HttpProvider.prototype.prepareRequest function uses XHR2, the printout form is:
content-type: application/json
vary: Origin
date: ...
content-length: ...
When the HttpProvider.prototype.prepareRequest function uses XMLHttpRequest, the printout form is:
content-type: application/json
vary: Origin
date: ...
content-length: ...
connection: close
@barakman
- Web3 has introduced the use of XHR2 some time ago.
- Web3 has revoked the use of XHR2 some time ago.
Unfortunately it looks like case 1 is true. XHR2 is used in the latest web3 0.x as well as web3 1.0. Have also tried running your reproduction case using web3 1.0 over websockets without luck...
This issue raises questions about whether web3 / truffle / ganache are really suited to running simulations with tens of thousands of calls. There might be significant value in building a tool that ran tests directly on top of ethereumjs-vm, or perhaps inside ganache, avoiding http overhead and other constraints.
@cgewecke:
I did a little reading, and it seems that connections are closed by default in HTTP 1.0 and kept alive by default in HTTP 1.1. And I'm guessing that XMLHttpRequest supports HTTP 1.0 while XHR2 supports HTTP 1.1, so it makes sense that Web3 has switched from XMLHttpRequest to XHR2 and not vice-versa.
As regards the second part of your comment, please note that I have experienced the same problem when using solidity-coverage along with testrpc-sc. And as far as I understand, those two are designated specifically for the purpose of "running simulations with tens of thousands of calls" (how else would you achieve complete coverage of your contracts?).
For now, I have added the following workaround on my system:
Next to package.json, added a file fix-truffle.js:

let FILE_NAME = "./node_modules/truffle/build/cli.bundled.js";
let fs = require("fs");
let oldData = fs.readFileSync(FILE_NAME, {encoding: "utf8"});
let newData = oldData.replace(/new XHR2/g, "new XMLHttpRequest");
fs.writeFileSync(FILE_NAME, newData, {encoding: "utf8"});

And in package.json, added:

"scripts": {
  "install": "node fix-truffle.js"
}
Thanks.
@cgewecke - just to finalize this issue (also for future readers):
The fix suggested above indeed seems to resolve the Could not connect to your Ethereum client problem discussed in this thread.
However, it exposes yet another problem:
Invalid JSON RPC response: "Error: socket hang up
at createHangUpError (_http_client.js:331:15)
at Socket.socketOnEnd (_http_client.js:423:23)
at emitNone (events.js:111:20)
at Socket.emit (events.js:208:7)
at endReadableNT (_stream_readable.js:1056:12)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)"
at ProviderError.ExtendableError (C:\Users\...\webpack:\~\truffle-error\index.js:10:1)
at new ProviderError (C:\Users\...\webpack:\~\truffle-provider\error.js:17:1)
at C:\Users\...\webpack:\~\truffle-provider\wrapper.js:71:1
at C:\Users\...\webpack:\~\truffle-provider\wrapper.js:129:1
at exports.XMLHttpRequest.request.onreadystatechange (C:\Users\...\webpack:\~\web3\lib\web3\httpprovider.js:128:1)
at exports.XMLHttpRequest.dispatchEvent (C:\Users\...\webpack:\~\xmlhttprequest\lib\XMLHttpRequest.js:591:1)
at setState (C:\Users\...\webpack:\~\xmlhttprequest\lib\XMLHttpRequest.js:610:1)
at exports.XMLHttpRequest.handleError (C:\Users\...\webpack:\~\xmlhttprequest\lib\XMLHttpRequest.js:532:1)
at ClientRequest.errorHandler (C:\Users\...\webpack:\~\xmlhttprequest\lib\XMLHttpRequest.js:459:1)
at Socket.socketOnEnd (_http_client.js:423:9)
at endReadableNT (_stream_readable.js:1056:12)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickCallback (internal/process/next_tick.js:180:9)
This problem seems to be of the following nature: a "massive" test completes successfully, but the test that follows it emits this error immediately when it begins; it seems that the "massive" test does not release its socket when it has been held for a long period (cutting the test shorter makes the error go away).
I believe that a possible fix for this problem is in the XMLHttpRequest function, around the area of:
request = doRequest(options, responseHandler).on("error", errorHandler);
Perhaps there's a missing handler for this request, for its socket, for its response or for its response's socket.
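Purely as a sketch of the kind of additional handlers that sentence speculates about (doRequest, options, responseHandler and errorHandler are the names from the quoted line; the extra listeners are illustrative, not a verified fix):

request = doRequest(options, responseHandler).on("error", errorHandler);
request.on("socket", function (socket) {
  socket.on("error", errorHandler);          // surface socket-level errors
  socket.on("close", function (hadError) {
    // a hook where an abruptly terminated connection could be detected
  });
});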
In either case, I have not been able to resolve it.
Most of my attempts were focused on searching NodeJS HTTP API for functions and/or events which might be used here.
This behavior should give some hints, but I'm not sure what exactly.
A simple workaround for this problem is to execute truffle test separately for each test file.
In other words, closing and reopening Truffle solves the problem, which implies that some resource (a socket?) is not released until Truffle is closed.
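For illustration, a minimal driver for that workaround (a sketch; it assumes the test files live under ./test):

const { execSync } = require("child_process");
const fs = require("fs");

for (const file of fs.readdirSync("./test").filter((f) => f.endsWith(".js"))) {
  // a fresh truffle process per test file, so held sockets are released in between
  execSync(`npx truffle test ./test/${file}`, {stdio: "inherit"});
}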
Unfortunately, this workaround is insufficient for solidity-coverage users (myself being among them), since this utility cannot be executed separately for each test file.
If someone can find a way to apply this ("close and reopen after every test file") in Truffle source code itself, then it might be a good solution.
I tried that too - in the Test.run function, at line js_tests.forEach(function(file)... - but couldn't quite get it to work.
@cgewecke:
I have managed to fix (or if you will, find a workaround for) the socket hang up issue described above, which has emerged after I had resolved the original issue (by replacing XHR2 with XMLHttpRequest).
As mentioned before, this socket hang up error seems to be pretty consistent in the fact that it happens only at the end of a massive test (or perhaps at the beginning of the test that follows).
A deeper investigation has shown that it always happens as a result of a request consisting of payload.method === 'evm_revert', to which the response is an error message (and obviously an invalid JSON).
A glimpse at Ganache source code reveals that evm_revert is indeed executed at the end of each test (using afterEach).
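For context, this is roughly how such per-test snapshot / revert hooks are driven over JSON-RPC (an illustrative mocha sketch, not Truffle's actual code; rpc() is a hypothetical helper and web3 is the globally available instance mentioned earlier):

function rpc(method, params) {
  return new Promise((resolve, reject) => {
    web3.currentProvider.sendAsync(
      {jsonrpc: "2.0", method: method, params: params || [], id: Date.now()},
      (err, res) => (err ? reject(err) : resolve(res.result))
    );
  });
}

let snapshotId;
beforeEach(async () => { snapshotId = await rpc("evm_snapshot"); });   // take a snapshot before each test
afterEach(async () => { await rpc("evm_revert", [snapshotId]); });     // roll the chain back after each test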
Though I don't have any real evidence to support this, I think that it is possibly because an evm_revert executed after a massive test takes a very long time to complete, during which the connection is timed out.
By the way, the status of this response is 0. I previously bumped into some GitHub thread discussing why you've decided not to ignore status 0 in Truffle (the reason being that a test might fail silently, if I remember correctly). I can't find this thread now, but you were in it, so you might find the remainder of this comment relevant.
In any case, in order to work around the socket hang up error, I simply patched the Truffle source code to ignore an error in the response if the request's payload.method is evm_revert.
Since evm_revert is not really a part of any test which I could possibly run on Truffle, I am confident that this fix cannot do any harm - for example, by (yet again) allowing a test to fail silently.
Here is the extended workaround (for both problems), for any future readers:
Next to package.json, add a file fix-truffle.js:

let FILE_NAME = "./node_modules/truffle/build/cli.bundled.js";
let TOKENS = [
  {prev: "request = new XHR2", next: "request = new XMLHttpRequest"},
  {prev: "error = errors.InvalidResponse", next: "error = payload.method === 'evm_revert' ? null : errors.InvalidResponse"}
];
let fs = require("fs");
let data = fs.readFileSync(FILE_NAME, {encoding: "utf8"});
for (let token of TOKENS) {
  data = data.replace(new RegExp(token.prev, "g"), token.next);
  console.log(`replaced "${token.prev}" with "${token.next}"`);
}
fs.writeFileSync(FILE_NAME, data, {encoding: "utf8"});
And in package.json, add:

"scripts": {
  "install": "node fix-truffle.js"
}
Thanks
UPDATE:
It seems that even if a socket hang up error which occurs as a result of an evm_revert request at the end of a test is resolved (by ignoring it), a similar error may then occur as a result of an evm_snapshot request at the end of the next test.
We can slightly extend the workaround above to handle both cases, by changing this:
payload.method === 'evm_revert'
To this:
payload.method.startsWith('evm')
As evm requests are not something likely to be invoked directly from a testing script, I think that this extension is quite safe (i.e., will not cast away "real" errors in a given test).
However, generally speaking, I get the feeling that while Ganache takes a very long time to complete these requests in some cases (more specifically, after a massive test is conducted), the connection is simply (and abruptly) terminated.
The fact that restarting truffle test resolves this issue implies that even if it is "Ganache's fault" (for taking so long to complete), it is also "Truffle's fault" for not handling it.
I am not very "happy" with the workaround proposed above, and I believe that a better approach would be to address the lengthy execution of evm_revert and evm_snapshot directly, rather than casting their errors away.
UPDATE 2:
For safety, extend this:
payload.method.startsWith('evm')
To this:
typeof payload.method === 'string' && payload.method.startsWith('evm')
Or even to this:
payload.method === 'evm_revert' || payload.method === 'evm_snapshot'
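Applied to the fix-truffle.js script above, the second TOKENS entry would then become (a sketch, not tested):

{
  prev: "error = errors.InvalidResponse",
  next: "error = (payload.method === 'evm_revert' || payload.method === 'evm_snapshot') ? null : errors.InvalidResponse"
}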
@barakman Thanks so much. The workaround you've proposed seems reasonable to me. There might be some kind of connection timeout at the HTTP layer - I've also seen this disconnection when running long solidity loops that validate bytecode in a call.
@barakman Out of curiosity, would making revert and snapshot optional help with your use case?
@cgewecke:
Thank you.
I assume that the purpose of these two functions is to reset the EVM emulation back to an initial state, so that each one of the tests executed by Truffle will start under the exact same conditions, regardless of the order in which the tests are executed (and of course, the exact same conditions will continue to apply every time you invoke truffle test).
All of this is designated to ensure deterministic execution, I assume, so making these functions optional is probably in contrast with correct testing methodologies.
That said, since it's optional, I guess that there's no harm done (i.e., Truffle users can choose that on their own risk).
That said #2, I've already added an npm-post-install script to fix Truffle source code, so I'm not in any dire need for this feature (though, I suppose I'll have to do some maintenance work on that script every time I update Truffle version, so perhaps it WILL help me in the future).
It would help for sure if you could check with the Ganache developers what might cause the execution of evm_revert and evm_snapshot to be so lengthy.
Thank you for your help.
@barakman
It would help for sure if you could check with the Ganache developers what might cause the execution of evm_revert and evm_snapshot to be so lengthy.
I will. In your current suite, approximately how many blocks are being snapshotted / reverted?
@cgewecke:
I have a total of 27 tests, so each one of these functions is invoked 27 times if that's what you mean.
Otherwise, can you please elaborate on what you mean by "how many blocks"?
Should I use web3 in order to get the block-number at the beginning and end of my longest test, and calculate the difference?
Apologies @barakman - yes you could do that or estimate the number of transactions that occur in the suite, since ganache executes a single tx per block.
I'd just like to give the Ganache engineers some guidance about what magnitude of tests triggers this.
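A minimal sketch of that measurement, assuming the synchronous web3 0.x API that is available inside truffle tests:

let startBlock;
before(() => { startBlock = web3.eth.blockNumber; });   // web3 0.x exposes a synchronous getter
after(() => {
  console.log("blocks produced by this suite:", web3.eth.blockNumber - startBlock);
});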
@cgewecke:
Just by looking at the code, I estimate that:
- The test whose evm_revert request fails executes approximately 16943 RPCs.
- The test whose evm_revert request fails, and whose following test's evm_snapshot request also fails, executes approximately 28954 RPCs.
I could give you more accurate figures by getting the block number before and after, but that would take me a while (each one of them runs for about 15-20 minutes or so).
Thanks
That's perfect, thanks @barakman.
Would there be a universal fix available any time soon?
I have random Error: CONNECTION ERROR: Couldn't connect to node http://127.0.0.1:7545/ errors when I do truffle test too (have 39 tests, 3 of which fail by that reason).
In May everything was still okay, today - it's not :(
@vicnaum Could you provide more detail about your suite or a link to project? At the moment we think this error is limited to very large suites. The principal reporter above has a battery of 50,000 tests.
Do the same 3 tests fail each time?
@cgewecke it's always different tests. Can be only one test failing, but can be at most five. Usually near three. I'm using Windows 10.
The sources are here: https://github.com/vicnaum/hourlyPay
@cgewecke : The error specified by vicnaum (connection error) does not seem to have any relation whatsoever to the issue described in this thread, which appears to be the result of limited resources (more precisely, the system running out of ephemeral ports for HTTP connections).
@vicnaum I think @barakman is correct - I looked through the hourlyPay code a bit and see you're using a lot of methods to move time around on the chain. Would you like to open a separate issue so we can investigate further?
ganache-cli shouldn't disconnect from truffle under any circumstances so this is likely a bug. Could you display the entire contents of your error and stack trace as well?
@cgewecke & @barakman & others having this issue: I haven't dug into this too deeply, but my guess is that either Truffle or the tests in question are creating new instances of provider very frequently.
Optimal resource management would be to take advantage of HTTP keep alive by reusing provider instances between tests rather than recreating them.
I can say from experience that sending Connection: close in the request or explicitly closing the client socket only kicks the can down the road for this problem as you'll still exhaust the local address space due to ports sitting in FIN_WAIT.
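A minimal sketch of the provider-reuse idea above (illustrative only; assumes the web3 0.x HttpProvider quoted earlier in this thread):

const Web3 = require("web3");

// construct a single provider once and share it across all test files,
// instead of creating a new one per test
const sharedProvider = new Web3.providers.HttpProvider("http://127.0.0.1:8545");
const web3 = new Web3(sharedProvider);

module.exports = web3;   // test files require this module rather than building their own provider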
@benjamincburns Yes, it turns out this originates at web3 and they're fixing it in beta.36.
(It was keep-alive - the change).
Closing this since it seems to have been addressed as a duplicate of the issue above. Let us know if it's still a problem. Thank you!
@gnidan:
AFAIK, this is still a problem on Truffle 4.1.15 (which still uses XHR2 instead of XMLHttpRequest).
In Truffle 5.x this is possibly fixed, since this part of the code has changed, though I haven't verified that, as it requires a bit of work on both my contracts and my tests.
To my understanding, you have released 4.1.15 specifically for this reason (i.e., for those who aren't rushing to upgrade their Solc and Web3 major versions).
So you might want to keep this issue open until it is fixed in the Truffle 4 branch (or at least leave a note somewhere to mention that this problem is still present there).
Thanks