Parity-ethereum: Warp snapshots don't get generated

Created on 13 Mar 2019  ·  19 Comments  ·  Source: openethereum/parity-ethereum

  • Parity Ethereum version: 2.2.11
  • Operating system: Linux
  • Installation: docker
  • Fully synchronized: yes
  • Network: private
  • Restarted: yes

Warp snapshots don't get generated. I have a node running with the following setup:

    command:
      --config /parity/config/authority.toml
      --engine-signer 0x002e28950558fbede1a9675cb113f0bd20912019
      --jsonrpc-interface 0.0.0.0
      --ws-interface 0.0.0.0
      --unsafe-expose
      --jsonrpc-cors all
      --no-persistent-txqueue
      --no-warp
      --jsonrpc-server-threads=8
      --jsonrpc-threads=8
      --no-discovery
      --fast-unlock
      --pruning-memory 1024
      --pruning-history 1000
      -lminer=trace,rpc=trace,txqueue=trace,engine=trace

And config.toml

[mining]
reseal_on_txs = "all"
reseal_min_period = 1000
reseal_max_period = 5000
gas_floor_target = "0x165A0BC00"
tx_queue_size = 16384
tx_queue_mem_limit = 4096
tx_queue_per_sender = 16384

emptySteps is enabled and there were no transactions in the queue, so the highest block stayed the same for the whole test.

I synced ~11k blocks but there are no warp snapshots anywhere. The logs don't contain anything useful, just "syncing syncing syncing...":

2019-03-13 19:10:05 UTC jsonrpc-eventloop-1 TRACE rpc  Request: {"jsonrpc":"2.0","method":"eth_getBlockByHash","params":["0xff35258e32c98329746a76a02ad873512b2c9f0d02430819671c55c5ba49f6f6",false],"id":74414}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-1 DEBUG rpc  Response: {"jsonrpc":"2.0","result":{"author":"0x00aa39d30f0d20ff03a22ccfc30b7efbfca597c2","difficulty":"0xfffffffffffffffffffffffffffffffe","emptySteps":"[(cf561084e6adb90a3771cb91aa87fbdb21b561a43d4ff12e1fbbddca873a9dea448fc10e72b7a7d8ced78cd7e13e161acc7c3a1dba481394741e1fc13f2c334c00, 310479390, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(5f5156d286e8bbfda6fbb4f42c209d1634f1324cc2dd4b517f6e14da54778d2377f0a6e9172f3541966f904a443287484747b005344d4836f7bae61f36ed64ec01, 310479391, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(c9738fab60f8d5a3b26c0474130e22cb8eaca8fd9fb42a506fd06f8151363bc25cb0ac7593c27532bf48998ae22edb891d6c0eda3d6a2d959d211140015feb4f01, 310479392, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(91cd7b0be2282b63876b93bcfda136063711a17af63a5cfdf779dcdc44aed12106ec5b2d3568d4b3c9f64048d1bc5bf9b406a67475cee52387545bf3292668a100, 310479393, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(804d141bfb38b88e8ac88ea7b9817c2a156f958d62e57271897188294ef08f7723bb91835dede476cec0e0d430538aee2293ab2aa4ec1bb567a53212eb6031e501, 310479394, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(afa4df381216fd4ed0a4bd8dd84695a658046bff20de39b3c57fbfec7de99e1176094196d26c5aa6a0f4b2f6a5d8e33e08553a33ae8f754c0d7fe2d71d305f1101, 310479395, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(4cece13a9e9937bb64c87dc0edece624fb16c2b1edd25cf03bc2ca025783a2ff28ad9bdbe6e9733aa875c2a674975856698fae1b1f58df7bf22afaa9077d32fc01, 310479396, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(434b266232ba6a18f16d932850dbc48cf8c8cbaa4565fe8b848664d9a490f3562c5d43264324eb39e4747dec898ab516f99fb4dd7e392e40be3625f5b730078500, 310479397, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(c477b26ab650da3b8a227799f9bd9d7d5a7003114783374432013d84eed53b937c607eac2487e2bd06f2a6f07516b4aef9f467ed661617d0497b20a9dcb94cd000, 310479398, ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4),(3f2a0a74b15d622dbcedaaa364f6be2d467a0bc1a4bf7fcc367c8fa9455ecede2571e07e003c3eb4ee556075b3cb9cb3cba326071ebcdd3371109a968900675c00, 310479399, 
ab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4)]","extraData":"0xde8302020b8f5061726974792d457468657265756d86312e33322e30826c69","gasLimit":"0x165a0bc00","gasUsed":"0x1c487b","hash":"0xff35258e32c98329746a76a02ad873512b2c9f0d02430819671c55c5ba49f6f6","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","miner":"0x00aa39d30f0d20ff03a22ccfc30b7efbfca597c2","number":"0x2874","parentHash":"0xab15cc847ed268d2987e93fc4738b6150454f5d869a35c5891d29f7de73c87e4","receiptsRoot":"0x06e10f657a442bfe5b21471455533f70d528d8a3125419aaf243b81b711606c9","sealFields":["0x8412818a28","0xb8415a959e8cea70ddf7c80e28c31eec4fbf8d03c8b470cbb934287f89325a6282bd7e220ded4fb58270274ab3e6ad160632b965cdd93d75db0994a48fd7b99fa2f401","0xf902e4f848b841cf561084e6adb90a3771cb91aa87fbdb21b561a43d4ff12e1fbbddca873a9dea448fc10e72b7a7d8ced78cd7e13e161acc7c3a1dba481394741e1fc13f2c334c008412818a1ef848b8415f5156d286e8bbfda6fbb4f42c209d1634f1324cc2dd4b517f6e14da54778d2377f0a6e9172f3541966f904a443287484747b005344d4836f7bae61f36ed64ec018412818a1ff848b841c9738fab60f8d5a3b26c0474130e22cb8eaca8fd9fb42a506fd06f8151363bc25cb0ac7593c27532bf48998ae22edb891d6c0eda3d6a2d959d211140015feb4f018412818a20f848b84191cd7b0be2282b63876b93bcfda136063711a17af63a5cfdf779dcdc44aed12106ec5b2d3568d4b3c9f64048d1bc5bf9b406a67475cee52387545bf3292668a1008412818a21f848b841804d141bfb38b88e8ac88ea7b9817c2a156f958d62e57271897188294ef08f7723bb91835dede476cec0e0d430538aee2293ab2aa4ec1bb567a53212eb6031e5018412818a22f848b841afa4df381216fd4ed0a4bd8dd84695a658046bff20de39b3c57fbfec7de99e1176094196d26c5aa6a0f4b2f6a5d8e33e08553a33ae8f754c0d7fe2d71d305f11018412818a23f848b8414cece13a9e9937bb64c87dc0edece624fb16c2b1edd25cf03bc2ca025783a2ff28ad9bdbe6e9733aa875c2a674975856698fae1b1f58df7bf22afaa9077d32fc018412818a24f848b841434b266232ba6a18f16d932850dbc48cf8c8cbaa4565fe8b848664d9a490f3562c5d43264324eb39e4747dec898ab516f99fb4dd7e392e40be3625f5b7300785008412818a25f848b841c477b26ab650da3b8a227799f9bd9d7d5a7003114783374432013d84eed53b937c607eac2487e2bd06f2a6f07516b4aef9f467ed661617d0497b20a9dcb94cd0008412818a26f848b8413f2a0a74b15d622dbcedaaa364f6be2d467a0bc1a4bf7fcc367c8fa9455ecede2571e07e003c3eb4ee556075b3cb9cb3cba326071ebcdd3371109a968900675c008412818a27"],"sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","signature":"5a959e8cea70ddf7c80e28c31eec4fbf8d03c8b470cbb934287f89325a6282bd7e220ded4fb58270274ab3e6ad160632b965cdd93d75db0994a48fd7b99fa2f401","size":"0x2121","stateRoot":"0x0c82eb3089dec969e879466ccfdce0bce907b34e08d2ef40b4f8692819f5520c","step":"310479400","timestamp":"0x5c87b2c8","totalDifficulty":"0x2873ffffffffffffffffffffffffed809ec8","transactions":["0xb23a735889544252074bff3ba0973d97230f8e35be96d6d399e368895ec2e84f"],"transactionsRoot":"0x97726a3d261aa67fd5be00d282ed0809ce7d158f940f5d59e01303408bd61356","uncles":[]},"id":74414}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 TRACE rpc  Request: {"jsonrpc":"2.0","method":"eth_getBlockTransactionCountByNumber","params":["pending"],"id":74415}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 DEBUG rpc  Response: {"jsonrpc":"2.0","result":"0x0","id":74415}.
2019-03-13 19:10:05 UTC IO Worker #1 TRACE engine  Fetched proposer for step 310500841: 0x00aa…97c2
2019-03-13 19:10:05 UTC IO Worker #1 TRACE engine  handle_message: received empty step message EmptyStep { signature: 0x76c9450ef8a42f1a9461b071538bf967baf9c4b0d35af079e3d5ed5235be6028014d7d6ac9af3420e4ce17c6a01b89365d27a1608c36a1f3b87f12e2ebbc7b8301, step: 310500841, parent_hash: 0xccc00e7266122f7307245707610fac3d35fd1c6b96074d993bc2a00ffadb710d }
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 TRACE rpc  Request: [{"jsonrpc":"2.0","method":"eth_syncing","params":[],"id":74416},{"jsonrpc":"2.0","method":"eth_getFilterChanges","params":["0x3"],"id":74417},{"jsonrpc":"2.0","method":"eth_getFilterChanges","params":["0x2"],"id":74418}].
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 DEBUG rpc  Response: [{"jsonrpc":"2.0","result":{"currentBlock":"0x2875","highestBlock":"0x0","startingBlock":"0x0","warpChunksAmount":null,"warpChunksProcessed":null},"id":74416},{"jsonrpc":"2.0","result":["0xef6c17ad75181568aae691da743473d9abc10db4c0c8fbe3f9a14e66f11cf236"],"id":74417},{"jsonrpc":"2.0","result":[],"id":74418}].
2019-03-13 19:10:05 UTC jsonrpc-eventloop-1 TRACE rpc  Request: {"jsonrpc":"2.0","method":"eth_getBlockByNumber","params":["0x2875",false],"id":74419}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-1 DEBUG rpc  Response: {"jsonrpc":"2.0","result":{"author":"0x002e28950558fbede1a9675cb113f0bd20912019","difficulty":"0xfffffffffffffffffffffffffffffffe","emptySteps":"[]","extraData":"0xde8302020b8f5061726974792d457468657265756d86312e33322e30826c69","gasLimit":"0x165a0bc00","gasUsed":"0x6843abf","hash":"0xef6c17ad75181568aae691da743473d9abc10db4c0c8fbe3f9a14e66f11cf236","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","miner":"0x002e28950558fbede1a9675cb113f0bd20912019","number":"0x2875","parentHash":"0xff35258e32c98329746a76a02ad873512b2c9f0d02430819671c55c5ba49f6f6","receiptsRoot":"0x409a14af6a729678da69fa3aab2cf5ae75aad1804821026d76cc76651b2385db","sealFields":["0x8412818a29","0xb8416fc2fdefe7d67bf00682b66ac837264ce94068d7093a4a9e127a868cae6eed976938e4a92dd85d797fd6a0dfb550944af5660897aba5e0c3c730f7243cc4a84801","0xc0"],"sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","signature":"6fc2fdefe7d67bf00682b66ac837264ce94068d7093a4a9e127a868cae6eed976938e4a92dd85d797fd6a0dfb550944af5660897aba5e0c3c730f7243cc4a84801","size":"0x2c4","stateRoot":"0x512c6a64e0cff8640df666bf2642762dc351f1fea24f6acd527493c59955fde4","step":"310479401","timestamp":"0x5c87b2cd","totalDifficulty":"0x2874ffffffffffffffffffffffffed809ec6","transactions":["0x247a3cb148f90871c2d6b99d75112d5425432f3c5ace032676f5316ea6dc6267"],"transactionsRoot":"0x0127e19b73978ea6015a9e34c2ac4287eeae01dc8f415b2ed7c1e5753f2a219d","uncles":[]},"id":74419}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 TRACE rpc  Request: {"jsonrpc":"2.0","method":"eth_getBlockTransactionCountByNumber","params":["pending"],"id":74420}.
2019-03-13 19:10:05 UTC jsonrpc-eventloop-0 DEBUG rpc  Response: {"jsonrpc":"2.0","result":"0x0","id":74420}.

P.S. No, upgrading to a newer version doesn't fix the issue.

F2-bug 🐞 M4-core ⛓

All 19 comments

Hi @Pzixel, I'm trying to reproduce your setup. I understand you are launching parity-poa-playground. The config.toml you posted is your modification to parity/config/authority.toml, right?

Hello. Yes, you are completely right. It's indeed a slight modification of the feature/ws branch of the aforementioned repo.

Ok, I checked that branch out and I see those values in the config file now.

Hey! Any progress on that?

Hi! Sorry, I'll be looking at it today. Meanwhile, if you like, you can try running with -lsnapshot=trace,snapshot_watcher=trace; those targets should be relevant.
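
For reference, a minimal sketch of how to wire those targets into the compose setup from the first post: since that command already passes a -l flag, the new targets can simply be appended to it (all other flags stay unchanged), so the last line of the command block becomes:

      -lminer=trace,rpc=trace,txqueue=trace,engine=trace,snapshot=trace,snapshot_watcher=trace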

Thanks, gonna try it!

I'm running a modified version of the code that takes snapshots every 500 blocks instead of every 5000, and I got this in the logs of authority0:

authority0_1  | 2019-03-19 10:33:40 UTC Taking snapshot at #500
authority0_1  | 2019-03-19 10:33:40 UTC Taking snapshot starting at block 500
authority0_1  | 2019-03-19 10:33:40 UTC Finished taking snapshot at #500

Now:

# ls -l /var/lib/docker/volumes/paritypoaplayground_data-volume/_data/97552686397107f6/snapshot/current/
total 68
-rw-r--r-- 1 elferdo elferdo 133 mar 19 11:33 1534d9627e9b95037f4a39efcd71046fbca8d6b9b1e4cf30e415c995a7ee9cd0
-rw-r--r-- 1 elferdo elferdo 150 mar 19 11:33 285b7806e63119304e8e98d690f0c5c6f8d954e704173d4f5ea172005e2490ed
-rw-r--r-- 1 elferdo elferdo  55 mar 19 11:33 4594332d390e06f23cd9019b7e4fcb684b0558479597cc81aabc881613c319dc
-rw-r--r-- 1 elferdo elferdo  55 mar 19 11:33 61f941a170bf9a96da45029d66e226ace4bbe7ceda127bba082934200601d00e
-rw-r--r-- 1 elferdo elferdo 109 mar 19 11:33 6b03638b667270dedacbd058738893b0fbbc27ecf8308238a83cfbdf45480024
-rw-r--r-- 1 elferdo elferdo 109 mar 19 11:33 7e5e31b5e7379870877f1e3fc349c485ad9463c69672a46dba390a44bb758ac2
-rw-r--r-- 1 elferdo elferdo 109 mar 19 11:33 833c46011723dfc9f55c3f476ec9ff9fa9bb376515bc1799e3ce4b98e03d45fc
-rw-r--r-- 1 elferdo elferdo  43 mar 19 11:33 8c2ad589bd9aaab1a2901b85647e20637b997ec31fdd303fcc9caf7cd353f565
-rw-r--r-- 1 elferdo elferdo  55 mar 19 11:33 95a358ecf969460f5874126be602ae51f8fa72b057b46501770fb23d12e3223f
-rw-r--r-- 1 elferdo elferdo 109 mar 19 11:33 a517d869ee7a692c99655c0559f0bec776dab9544cd7fdbf1f6b43702e4c8b65
-rw-r--r-- 1 elferdo elferdo  97 mar 19 11:33 ad033b15a55798d811bdaebe71b9c3548544d3b8b12f6ea76abeb086f7932190
-rw-r--r-- 1 elferdo elferdo 106 mar 19 11:33 b980382a1b57c3ac31b0ae32af254b6f851d12d4cb61ce8346664bde2f26b3e6
-rw-r--r-- 1 elferdo elferdo  57 mar 19 11:33 bd66112794cc57d7ecdb3eb55ea288752ab39469fd94a64670e73dca918f10a5
-rw-r--r-- 1 elferdo elferdo 133 mar 19 11:33 e5264217357dd1a919016bca1619c863f46bab43e0074f944fc95eead1f145aa
-rw-r--r-- 1 elferdo elferdo 947 mar 19 11:33 e87127c0176c1d74fc81c10029221ba0dc882e984c46253eceed0abeccda2eb3
-rw-r--r-- 1 elferdo elferdo  43 mar 19 11:33 eecfa66f5afbc15766c377f2c30545003028a1e982fe519f3a72e9ed30b72fd4
-rw-r--r-- 1 elferdo elferdo 605 mar 19 11:33 MANIFEST

Can you please check your docker volume and see if you find anything there?
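
For what it's worth, this is roughly how I located that directory above (a sketch from my machine; the volume name and the db hash directory are specific to my parity-poa-playground setup, so yours will differ):

docker volume ls | grep data-volume
ls -l /var/lib/docker/volumes/paritypoaplayground_data-volume/_data/97552686397107f6/snapshot/current/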

No, nothing there.

Running with:

    command:
      --config /parity/config/member.toml
      --jsonrpc-interface 0.0.0.0
      --ws-interface 0.0.0.0
      --unsafe-expose
      --jsonrpc-cors all
      --no-persistent-txqueue
      --jsonrpc-server-threads=8
      --jsonrpc-threads=8
      --no-discovery
      --fast-unlock
      --no-warp
      -lsnapshot=trace,snapshot_watcher=trace

logs.txt

Just regular "syncing stuff".

# du -h
4.0K    ./324e784c3650c014/snapshot
4.0K    ./324e784c3650c014/overlayrecent/db/trace_blooms
124K    ./324e784c3650c014/overlayrecent/db/blooms
4.9G    ./324e784c3650c014/overlayrecent/db
4.9G    ./324e784c3650c014/overlayrecent
4.9G    ./324e784c3650c014
4.9G    .

Could you please check/post the logs of the authority that is running -lsnapshot_watcher=trace? You should be seeing something like:

authority2_1  | 2019-03-19 11:16:25 UTC IO Worker #3 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:30 UTC Verifier #5 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:35 UTC Verifier #9 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:40 UTC IO Worker #0 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:45 UTC Verifier #1 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:50 UTC Verifier #6 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:16:55 UTC IO Worker #2 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:17:00 UTC Verifier #10 TRACE snapshot_watcher  1 imported
authority2_1  | 2019-03-19 11:17:06 UTC Verifier #9 TRACE snapshot_watcher  1 imported

This "watcher" is the one who triggers snapshotting at regular intervals, and it looks like:

authority2_1  | 2019-03-19 11:15:56 UTC IO Worker #1 TRACE snapshot_watcher  broadcast: 1000
authority2_1  | 2019-03-19 11:15:56 UTC Periodic Snapshot INFO ethcore::snapshot::service  Taking snapshot at #1000
authority2_1  | 2019-03-19 11:15:56 UTC Periodic Snapshot INFO ethcore::snapshot  Taking snapshot starting at block 1000
authority2_1  | 2019-03-19 11:15:56 UTC Periodic Snapshot INFO snapshot  Using 6 threads for Snapshot creation.
authority2_1  | 2019-03-19 11:15:56 UTC  DEBUG snapshot  Chunking part 3 in thread 3
authority2_1  | 2019-03-19 11:15:56 UTC  DEBUG snapshot  Chunking part 1 in thread 1

In your case, that should happen at block 5000.
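
A quick sanity check (a sketch assuming the docker-compose service names from my playground setup; adjust the service name to your node) is to grep the node's logs for the watcher and snapshot lines shown above:

docker-compose logs authority0 | grep -E "snapshot_watcher|Taking snapshot"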

These are the logs from the node running with -lsnapshot_watcher=trace, as you can see in the post above (i.e. -lsnapshot=trace,snapshot_watcher=trace), and I see none of the aforementioned entries in the log. I could restart once more to make sure the setting is applied, but it will take a while until it syncs at least 5k blocks.

I am comparing the command you posted first with my checkout of feature/ws and I see some differences; in particular, you are passing --no-warp. Do you launch all nodes with that command?

Yes, I think it affects the syncing only.

I'll remove it and restart. Let's see what happens.

I'll try that.

I just changed my command options to match yours, including --no-warp (which actually only affects syncing; snapshots are still created), and a snapshot was created at block 1500.

I took your bootnodes and pasted them into the member.toml file from parity-poa-playground/feature/ws. I took chain.json as it is and ran:

target/release/parity -lsync=trace,snapshot=trace,snapshot_watcher=trace,import=trace --config member.toml --no-persistent-txqueue --fast-unlock

It does not seem to be syncing. Output looks like:

2019-03-19 15:32:05  main INFO parity_ethereum::run  Starting Parity-Ethereum/v2.2.1-beta-5c56fc502-20181114/x86_64-linux-gnu/rustc1.32.0
2019-03-19 15:32:05  main INFO parity_ethereum::run  Keys path /home/elferdo/.local/share/io.parity.ethereum/keys/parity-poa-playground
2019-03-19 15:32:05  main INFO parity_ethereum::run  DB path /home/elferdo/.local/share/io.parity.ethereum/chains/parity-poa-playground/db/97552686397107f6
2019-03-19 15:32:05  main INFO parity_ethereum::run  State DB configuration: fast
2019-03-19 15:32:05  main INFO parity_ethereum::run  Operating mode: active
2019-03-19 15:32:05  main INFO ethcore_service::service  Configured for parity-poa-playground using AuthorityRound engine
2019-03-19 15:32:06  main INFO parity_ethereum::run  Running without a persistent transaction queue.
2019-03-19 15:32:06  IO Worker #1 INFO network  Public node URL: enode://4db112d6355cec36c617b3369601e406963f6e76353c5040883ff85f38a92b2f51f4d5740a827b475e8d1de8885d8c08eb9aaf2b402ddf61c5ce9efb4912d06a@192.168.178.3:30303
2019-03-19 15:32:36  IO Worker #0 INFO import     0/25 peers   8 KiB chain 15 KiB db 0 bytes queue 448 bytes sync  RPC:  0 conn,    0 req/s,    0 µs
2019-03-19 15:33:06  IO Worker #3 INFO import     0/25 peers   8 KiB chain 15 KiB db 0 bytes queue 448 bytes sync  RPC:  0 conn,    0 req/s,    0 µs
2019-03-19 15:33:36  IO Worker #3 INFO import     0/25 peers   8 KiB chain 15 KiB db 0 bytes queue 448 bytes sync  RPC:  0 conn,    0 req/s,    0 µs

I can ping your machines, so that's not the issue. With -lnetwork=trace I see the nodes are actually connecting, but for some reason they're not syncing:

2019-03-19 15:42:15  IO Worker #2 TRACE network  0x9e00…0e62: Connecting to V4(94.79.51.219:30303)
2019-03-19 15:42:15   TRACE network  connection reregister; token=Token(3)
2019-03-19 15:42:15  IO Worker #2 TRACE network  2: Initiating session Some(0x9e0036d4200f4a6124cf02ae0f760d04ff213d96344e02fe181bb18a2710a2b8ab85cd3e17073b77a723724b13a3e7ffd49451571464a5414a2ce44e92f50e62)
2019-03-19 15:42:15   TRACE network  connection register; token=Token(3)
2019-03-19 15:42:15  IO Worker #2 TRACE network  Sending handshake auth to "Unknown"
2019-03-19 15:42:15  IO Worker #2 TRACE network  2: Sending 307 bytes
2019-03-19 15:42:15  IO Worker #2 TRACE network  Expect to read 210 bytes
2019-03-19 15:42:15  IO Worker #2 DEBUG network  Connecting peers: 0 sessions, 1 pending + 4 started

Sorry, I forgot that they have a different chain.json, because it's not a test environment.

Here they are:

chain.json.txt

Network name is MoscowFairs

Great, let's give that a try.

So I'm suspicious of this line:

2019-03-19 15:42:15  IO Worker #2 TRACE network  Sending handshake auth to "Unknown"

I should be seeing an IP there, not "Unknown". Could you please check with -lnetwork=trace whether that is the case on your side? It comes up right at the beginning of the log.
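
One quick way to check (again just a sketch; substitute your own service or container name for the hypothetical your-node) is to grep the node's log for that handshake line shortly after startup:

docker-compose logs your-node | grep "Sending handshake auth"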

Closing this issue as stale.
