_Before filing a new issue, please provide the following information._
I'm running:
- Parity version: Parity/v1.8.0-unstable-b9c1d0bd1-20170814/x86_64-macos/rustc1.18.0
- Operating system: MacOS
- And installed: from source (rustc 1.20.0-nightly (6d9d82d3d 2017-07-14))
_Your issue description goes here below. Try to include actual vs. expected behavior and steps to reproduce the issue._
I tried running `parity --light --scale-verifiers --num-verifiers 4 --fast-and-loose` (and plain `parity --light`), but both times I get stuck at:
2017-08-16 21:42:48 Syncing #1925655 ac02…fc53 0 hdr/s 0+ 0 Qed #1925655 9/50 peers 2 MiB cache 0 bytes queue RPC: 1 conn, 4 req/s, 139 µs
I would expect it to sync all the way to the tip, which is currently at block 4168389.
I can reproduce this, though at a different block number. The sync simply stalls:
2017-08-17 13:47:11 Syncing #2210563 73a6…e17b 658 hdr/s 16027+ 0 Qed #2210563 4/50 peers 10 MiB cache 13 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:16 Syncing #2213435 2385…a45f 573 hdr/s 21088+ 0 Qed #2213435 3/50 peers 10 MiB cache 18 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:21 Syncing #2216057 22ad…1b4d 524 hdr/s 18464+ 0 Qed #2216057 5/50 peers 10 MiB cache 16 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:26 Syncing #2219259 4a67…0f16 640 hdr/s 15259+ 0 Qed #2219259 6/50 peers 10 MiB cache 13 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:31 Syncing #2220819 6237…fc34 312 hdr/s 13706+ 0 Qed #2220819 6/50 peers 10 MiB cache 12 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:36 Syncing #2223154 d00e…a9a0 467 hdr/s 30057+ 0 Qed #2223154 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:41 Syncing #2225086 f6ad…10af 386 hdr/s 30047+ 0 Qed #2225086 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:46 Syncing #2226739 4341…f996 329 hdr/s 30057+ 0 Qed #2226739 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:51 Syncing #2228482 aefb…2227 348 hdr/s 29970+ 0 Qed #2228482 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:47:56 Syncing #2230053 28bb…2479 314 hdr/s 30071+ 0 Qed #2230053 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:01 Syncing #2231811 cb7f…0e23 351 hdr/s 30105+ 0 Qed #2231811 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:06 Syncing #2233681 a2fc…d4ac 374 hdr/s 30028+ 0 Qed #2233681 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:11 Syncing #2235395 9bb2…424c 342 hdr/s 29979+ 0 Qed #2235395 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:16 Syncing #2237312 c947…a683 383 hdr/s 29981+ 0 Qed #2237312 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:21 Syncing #2239486 4495…dd91 434 hdr/s 29984+ 0 Qed #2239486 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:26 Syncing #2241645 36cf…75e4 431 hdr/s 29997+ 0 Qed #2241645 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:31 Syncing #2243605 b02d…e993 391 hdr/s 30085+ 0 Qed #2243605 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:36 Syncing #2245970 2817…6904 473 hdr/s 30005+ 0 Qed #2245970 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:41 Syncing #2249369 d2d9…5ef1 679 hdr/s 29953+ 0 Qed #2249369 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:46 Syncing #2251805 9b09…9178 487 hdr/s 29952+ 0 Qed #2251805 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:51 Syncing #2255426 88e2…708f 724 hdr/s 29917+ 1 Qed #2255426 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:48:56 Syncing #2258520 db17…d775 618 hdr/s 30021+ 0 Qed #2258520 6/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:06 Syncing #2265371 520d…48c5 685 hdr/s 23684+ 0 Qed #2265371 1/50 peers 10 MiB cache 20 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:11 Syncing #2268282 ef28…2c8f 582 hdr/s 20768+ 0 Qed #2268282 3/50 peers 10 MiB cache 17 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:16 Syncing #2271232 b38c…f521 590 hdr/s 17822+ 0 Qed #2271232 1/50 peers 10 MiB cache 15 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:21 Syncing #2274693 438a…f09a 692 hdr/s 14619+ 0 Qed #2274693 3/50 peers 10 MiB cache 12 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:26 Syncing #2278338 c31f…cfdc 729 hdr/s 11484+ 0 Qed #2278338 2/50 peers 10 MiB cache 10 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:31 Syncing #2279999 6d4b…bf7c 332 hdr/s 10080+ 0 Qed #2279999 2/50 peers 10 MiB cache 8 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:36 Syncing #2282730 ed58…e97f 546 hdr/s 8116+ 0 Qed #2282730 2/50 peers 10 MiB cache 7 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:41 Syncing #2285819 71a2…ecc8 617 hdr/s 5020+ 0 Qed #2285819 1/50 peers 10 MiB cache 4 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:46 Syncing #2288706 3b09…e163 577 hdr/s 2139+ 0 Qed #2288706 1/50 peers 10 MiB cache 2 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:51 Syncing #2289059 209b…46ad 70 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:49:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:50:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:51:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:52:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:41 Syncing #2289059 209b…46ad 0 hdr/s 79+ 0 Qed #2289059 3/50 peers 10 MiB cache 76 KiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:51 Syncing #2289059 209b…46ad 0 hdr/s 139+ 0 Qed #2289059 3/50 peers 10 MiB cache 133 KiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:53:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:06 Syncing #2289059 209b…46ad 0 hdr/s 393+ 0 Qed #2289059 2/50 peers 10 MiB cache 342 KiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:54:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:55:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:56:56 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:01 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:06 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 2/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:11 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:16 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:21 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:26 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:31 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:36 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:41 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:46 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-08-17 13:57:51 Syncing #2289059 209b…46ad 0 hdr/s 0+ 0 Qed #2289059 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
Same here, at a different block number (macOS 10.12.6).
Could simply be that request credits are running out. `pip=trace` logs would be helpful, although they would produce a lot of output; the last 30 seconds' worth would be enough in any case.
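For anyone wanting to capture those, a minimal way to do it (assuming the `--logging` flag shown later in this thread, and that Parity writes its log output to stderr) would be something like:

```
# enable PIP trace logging for a light client and capture it to a file
parity --light --logging pip=trace 2> pip-trace.log
```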
Having the exact same problem. Constantly stalls and will remain stalled until I kill and restart the client.
FWIW I had the same problem but it went away when compiling and running the latest master.
@Robzz I just tried with the latest commit on master running on Ubuntu and it still stalls during sync.
+1 for Parity/v1.7.0-beta-5f2cabd-20170727/x86_64-macos/rustc1.18.0
+1 for Parity/v1.7.2-beta-9f47909-20170918/x86_64-linux-gnu/rustc1.19.0
My laptop went to sleep; when it came back, it was stuck at 1938838, and restarting doesn't change anything. How should I enable better logging to help debug?
Ran parity inside docker (which uses the 1.7.2 deb) with `--logging pip=trace --light --reserved-only` and a reserved peers file that points to my recently set up archival node:
https://gist.github.com/WyseNynja/30fd6df9ca726604043a452e5bc525fc
If I remove --light from the command, the container starts syncing fine.
I have grafana set up for my archival node, so I have a graph of upload bandwidth for the parity container.
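For reference, the reserved peers file mentioned above is just a plain-text list of enode URLs, one per line. A hypothetical example (the node ID and address here are placeholders, not a real node):

```
enode://<64-byte-node-id-hex>@192.0.2.10:30303
```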
Started a light client from scratch with pip=trace logging and --reserved-only pointed at my archival node: https://gist.github.com/WyseNynja/4a956ca5e3a5a8b24be3aa0376eff030
EDIT 1 (12:15 PM PST): It has stalled a couple of times so far, but it resumes after a few minutes.
EDIT 2 (4:15 PM PST): Sigh. Of course with logging on, it was able to sync fully.
My node got stuck again. Pretty useless logs.
2017-09-25 18:32:48 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:48 UTC IO Worker #1 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:49 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:49 UTC IO Worker #0 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:49 UTC IO Worker #1 DEBUG pip Error sending packet to peer 0: Network error (Expired message)
2017-09-25 18:32:51 UTC IO Worker #3 TRACE pip Sending status to peer 0
2017-09-25 18:32:51 UTC IO Worker #0 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:52 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:52 UTC IO Worker #0 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:53 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:53 UTC IO Worker #3 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:55 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:55 UTC IO Worker #3 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:55 UTC IO Worker #1 TRACE pip Sending status to peer 0
2017-09-25 18:32:55 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:56 UTC IO Worker #2 TRACE pip Sending status to peer 0
2017-09-25 18:32:56 UTC IO Worker #1 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:57 UTC IO Worker #3 TRACE pip Sending status to peer 0
2017-09-25 18:32:57 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:57 UTC IO Worker #3 DEBUG pip Error sending packet to peer 0: Network error (Expired message)
2017-09-25 18:32:59 UTC IO Worker #0 TRACE pip Sending status to peer 0
2017-09-25 18:32:59 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:32:59 UTC IO Worker #0 TRACE pip Sending status to peer 1
2017-09-25 18:32:59 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:32:59 UTC IO Worker #0 DEBUG pip Error sending packet to peer 1: Network error (Expired message)
2017-09-25 18:33:00 UTC IO Worker #1 TRACE pip Sending status to peer 1
2017-09-25 18:33:00 UTC IO Worker #2 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:02 UTC IO Worker #1 TRACE pip Sending status to peer 1
2017-09-25 18:33:02 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:02 UTC IO Worker #1 TRACE pip Sending status to peer 1
2017-09-25 18:33:02 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:03 UTC IO Worker #2 TRACE pip Sending status to peer 1
2017-09-25 18:33:03 UTC IO Worker #1 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:03 UTC IO Worker #2 DEBUG pip Error sending packet to peer 1: Network error (Expired message)
2017-09-25 18:33:04 UTC IO Worker #0 TRACE pip Sending status to peer 1
2017-09-25 18:33:04 UTC IO Worker #2 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:06 UTC IO Worker #3 TRACE pip Sending status to peer 1
2017-09-25 18:33:06 UTC IO Worker #0 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:07 UTC IO Worker #0 TRACE pip Sending status to peer 1
2017-09-25 18:33:07 UTC IO Worker #1 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:09 UTC IO Worker #0 TRACE pip Sending status to peer 1
2017-09-25 18:33:09 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:10 UTC IO Worker #1 TRACE pip Sending status to peer 1
2017-09-25 18:33:10 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:10 UTC IO Worker #1 DEBUG pip Error sending packet to peer 1: Network error (Expired message)
2017-09-25 18:33:10 UTC IO Worker #3 TRACE pip Sending status to peer 1
2017-09-25 18:33:11 UTC IO Worker #0 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:12 UTC IO Worker #3 TRACE pip Sending status to peer 0
2017-09-25 18:33:12 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:33:12 UTC IO Worker #2 TRACE pip Sending status to peer 0
2017-09-25 18:33:12 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:33:13 UTC IO Worker #2 INFO import 0/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-25 18:33:15 UTC IO Worker #0 TRACE pip Sending status to peer 0
2017-09-25 18:33:15 UTC IO Worker #2 TRACE pip Peer 0 disconnecting
2017-09-25 18:33:15 UTC IO Worker #0 DEBUG pip Error sending packet to peer 0: Network error (Expired message)
2017-09-25 18:33:15 UTC IO Worker #3 TRACE pip Sending status to peer 0
2017-09-25 18:33:16 UTC IO Worker #0 TRACE pip Peer 0 disconnecting
2017-09-25 18:33:17 UTC IO Worker #2 TRACE pip Sending status to peer 0
2017-09-25 18:33:17 UTC IO Worker #1 TRACE pip Peer 0 disconnecting
2017-09-25 18:33:21 UTC IO Worker #1 TRACE pip Sending status to peer 1
2017-09-25 18:33:22 UTC IO Worker #1 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:23 UTC IO Worker #0 TRACE pip Sending status to peer 1
2017-09-25 18:33:23 UTC IO Worker #1 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:27 UTC IO Worker #2 TRACE pip Sending status to peer 1
2017-09-25 18:33:27 UTC IO Worker #1 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:28 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:28 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:29 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:29 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:31 UTC IO Worker #0 TRACE pip Sending status to peer 2
2017-09-25 18:33:31 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:32 UTC IO Worker #3 TRACE pip Sending status to peer 1
2017-09-25 18:33:32 UTC IO Worker #3 TRACE pip Peer 1 disconnecting
2017-09-25 18:33:33 UTC IO Worker #0 TRACE pip Sending status to peer 2
2017-09-25 18:33:33 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:33 UTC IO Worker #0 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:34 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:34 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:35 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:35 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:37 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:37 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:38 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:38 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:38 UTC IO Worker #3 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:39 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:39 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:40 UTC IO Worker #0 TRACE pip Sending status to peer 2
2017-09-25 18:33:40 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:41 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:41 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:41 UTC IO Worker #1 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:42 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:42 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:43 UTC IO Worker #2 TRACE pip Sending status to peer 2
2017-09-25 18:33:43 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:43 UTC IO Worker #1 INFO import 0/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-25 18:33:44 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:44 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:46 UTC IO Worker #0 TRACE pip Sending status to peer 2
2017-09-25 18:33:46 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:46 UTC IO Worker #0 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:46 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:46 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:47 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:47 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:48 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:48 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:49 UTC IO Worker #2 TRACE pip Sending status to peer 2
2017-09-25 18:33:49 UTC IO Worker #2 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:49 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:50 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:50 UTC IO Worker #0 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:51 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:51 UTC IO Worker #0 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:51 UTC IO Worker #3 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:52 UTC IO Worker #1 TRACE pip Sending status to peer 2
2017-09-25 18:33:52 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:52 UTC IO Worker #1 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:53 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:53 UTC IO Worker #1 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:53 UTC IO Worker #3 DEBUG pip Error sending packet to peer 2: Network error (Expired message)
2017-09-25 18:33:54 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:54 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:55 UTC IO Worker #2 TRACE pip Sending status to peer 2
2017-09-25 18:33:55 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:56 UTC IO Worker #3 TRACE pip Sending status to peer 2
2017-09-25 18:33:56 UTC IO Worker #2 TRACE pip Peer 2 disconnecting
2017-09-25 18:33:57 UTC IO Worker #2 TRACE pip Sending status to peer 2
2017-09-25 18:33:57 UTC IO Worker #3 TRACE pip Peer 2 disconnecting
After a few minutes of that, it started syncing again. What logging should I enable on the other side? My guess is @rphmeier is right about request credits simply running out. I would have expected a reserved peer not to be able to run out of credits, though. And without --reserved-peers it still seems to stall sometimes.
@WyseNynja Currently there isn't any logic that gives reserved peers infinite credits, although that would definitely be a useful addition! In general, it can be pretty hard to find peers that serve light client data, because many full nodes do not. An upgraded peer discovery protocol could certainly alleviate that.
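For those unfamiliar with the credit mechanism being discussed: conceptually it behaves like a token bucket, where each request costs credits and credits recharge over time, so a light client served by a single rate-limiting peer will periodically look stalled while it waits for a recharge. A rough illustrative sketch (this is not Parity's actual implementation; all names and numbers are invented):

```rust
use std::time::Instant;

/// Toy token-bucket model of per-peer request credits.
/// All names and numbers are invented for illustration only.
struct Credits {
    balance: u64,          // credits currently available
    max: u64,              // cap the balance recharges toward
    recharge_per_sec: u64, // credits regained per second of idleness
    last_update: Instant,
}

impl Credits {
    /// Recharge based on elapsed time, capped at `max`.
    fn recharge(&mut self) {
        let elapsed = self.last_update.elapsed().as_secs();
        self.balance = (self.balance + elapsed * self.recharge_per_sec).min(self.max);
        self.last_update = Instant::now();
    }

    /// Try to pay for a request. `false` means we must wait for a recharge,
    /// which from the outside looks exactly like a stalled sync.
    fn try_spend(&mut self, cost: u64) -> bool {
        self.recharge();
        if self.balance >= cost {
            self.balance -= cost;
            true
        } else {
            false
        }
    }
}
```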
I was able to light-sync musicoin with latest master, and that sync also got stuck at some point. I was connected to my own archive node (nobody else is running a light-serving node on musicoin); after shutting down the light client, restarting the archival node, and restarting the light client, it was able to finish the sync.
Minor annoyance: after completing the sync, the logs do not show the latest block number, so I can only verify it's finished via RPC/WS or UI.
2017-09-26 10:23:14 Syncing #1052064 9add…8280 234 hdr/s 29932+ 0 Qed #1052064 3/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:19 Syncing #1053867 e19b…6e46 360 hdr/s 30049+ 0 Qed #1053867 5/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:24 Syncing #1055703 9dcb…765c 367 hdr/s 30001+ 0 Qed #1055703 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:29 Syncing #1057460 07ca…a90f 350 hdr/s 30040+ 0 Qed #1057460 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:34 Syncing #1059004 6015…35fd 308 hdr/s 30028+ 0 Qed #1059004 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:39 Syncing #1060579 3af9…e2e8 315 hdr/s 29975+ 0 Qed #1060579 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:44 Syncing #1062415 9369…3364 367 hdr/s 30076+ 0 Qed #1062415 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:49 Syncing #1064215 4adf…a79c 360 hdr/s 30069+ 0 Qed #1064215 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:54 Syncing #1065744 ee66…a090 305 hdr/s 29946+ 0 Qed #1065744 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:23:59 Syncing #1067333 d80f…8e2f 317 hdr/s 30021+ 0 Qed #1067333 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:04 Syncing #1068946 f54b…0aed 322 hdr/s 29945+ 0 Qed #1068946 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:09 Syncing #1070651 59bf…3ab4 341 hdr/s 30034+ 0 Qed #1070651 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:14 Syncing #1072198 5f48…4f7d 309 hdr/s 30018+ 0 Qed #1072198 5/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:19 Syncing #1074067 bd40…d861 373 hdr/s 29944+ 0 Qed #1074067 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:24 Syncing #1075814 7dca…9b40 349 hdr/s 29990+ 0 Qed #1075814 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:29 Syncing #1077809 03ee…f706 398 hdr/s 30042+ 0 Qed #1077809 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:34 Syncing #1079777 e742…c02a 392 hdr/s 29992+ 0 Qed #1079777 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:39 Syncing #1081154 f9ea…61b9 275 hdr/s 30024+ 0 Qed #1081154 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:44 Syncing #1083252 74d5…3c97 419 hdr/s 29721+ 0 Qed #1083252 4/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:49 Syncing #1085284 673d…69c8 406 hdr/s 27685+ 0 Qed #1085284 4/50 peers 10 MiB cache 23 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:54 Syncing #1086918 d987…a6ec 326 hdr/s 26054+ 0 Qed #1086918 4/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:24:59 Syncing #1088425 a47b…de0e 301 hdr/s 24546+ 0 Qed #1088425 4/50 peers 10 MiB cache 21 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:04 Syncing #1089938 debd…2fb3 302 hdr/s 23031+ 0 Qed #1089938 4/50 peers 10 MiB cache 19 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:09 Syncing #1091431 9cc4…3213 298 hdr/s 21542+ 0 Qed #1091431 4/50 peers 10 MiB cache 18 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:14 Syncing #1092861 1612…f8bf 285 hdr/s 20109+ 0 Qed #1092861 4/50 peers 10 MiB cache 17 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:19 Syncing #1094170 3e47…ca2d 261 hdr/s 18800+ 0 Qed #1094170 4/50 peers 10 MiB cache 16 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:24 Syncing #1095436 2c44…2111 252 hdr/s 17537+ 0 Qed #1095436 4/50 peers 10 MiB cache 15 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:29 Syncing #1096645 1be2…ee1f 241 hdr/s 16328+ 0 Qed #1096645 4/50 peers 10 MiB cache 14 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:34 Syncing #1097658 f587…0f45 202 hdr/s 15312+ 0 Qed #1097658 4/50 peers 10 MiB cache 13 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:39 Syncing #1098763 b015…9dad 220 hdr/s 14209+ 0 Qed #1098763 4/50 peers 10 MiB cache 12 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:44 Syncing #1099921 55b6…89b9 231 hdr/s 13052+ 0 Qed #1099921 4/50 peers 10 MiB cache 11 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:49 Syncing #1101119 cf75…163e 239 hdr/s 11854+ 0 Qed #1101119 4/50 peers 10 MiB cache 10 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:54 Syncing #1102190 9f8f…d8ed 213 hdr/s 10783+ 0 Qed #1102190 4/50 peers 10 MiB cache 9 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:25:59 Syncing #1103250 796a…85ad 211 hdr/s 9723+ 1 Qed #1103250 4/50 peers 10 MiB cache 8 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:04 Syncing #1104237 9f90…6314 197 hdr/s 8735+ 0 Qed #1104237 4/50 peers 10 MiB cache 7 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:09 Syncing #1105532 3868…662c 259 hdr/s 7441+ 0 Qed #1105532 4/50 peers 10 MiB cache 6 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:14 Syncing #1106587 0cf0…a7fe 210 hdr/s 6386+ 0 Qed #1106587 4/50 peers 10 MiB cache 5 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:19 Syncing #1107696 2b9a…2dd9 221 hdr/s 5276+ 0 Qed #1107696 6/50 peers 10 MiB cache 4 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:24 Syncing #1108686 5d1a…5bdd 197 hdr/s 4285+ 0 Qed #1108686 4/50 peers 10 MiB cache 4 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:29 Syncing #1109577 bab1…4d5a 178 hdr/s 3394+ 0 Qed #1109577 4/50 peers 10 MiB cache 3 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:34 Syncing #1110036 b816…c253 91 hdr/s 2934+ 0 Qed #1110036 4/50 peers 10 MiB cache 2 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:39 Syncing #1111164 f4ca…b1c1 225 hdr/s 1809+ 0 Qed #1111164 4/50 peers 10 MiB cache 2 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:44 Syncing #1112306 7782…5092 228 hdr/s 666+ 0 Qed #1112306 4/50 peers 10 MiB cache 576 KiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:49 Syncing #1112977 1af1…ce3c 134 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:54 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:26:59 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:04 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 5/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:09 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:14 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:19 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:24 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:29 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:34 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:39 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:44 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:49 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:54 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:27:59 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:28:04 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:28:09 Syncing #1112977 1af1…ce3c 0 hdr/s 0+ 0 Qed #1112977 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
^C
0 ✓ user@eros ~ $ ~/.opt/parity/target/release/parity --light --base-path /home/user/.local/share/io.parity.ethereum-08 --chain musicoin --ports-shift 830 --reserved-peers ~/Desktop/reserved-smc.txt
2017-09-26 10:31:19 Starting Parity/v1.8.0-unstable-7940bf6ec-20170921/x86_64-linux-gnu/rustc1.19.0
2017-09-26 10:31:19 Keys path /home/user/.local/share/io.parity.ethereum-08/keys/Musicoin
2017-09-26 10:31:19 DB path /home/user/.local/share/io.parity.ethereum-08/chains/Musicoin/db/a9974ec0e92bf923
2017-09-26 10:31:19 Path to dapps /home/user/.local/share/io.parity.ethereum-08/dapps
2017-09-26 10:31:19 Running in experimental Light Client mode.
2017-09-26 10:31:25 Public node URL: enode://54993a439d6ecc96c7724111dba9e45a2dba3e9a9f44bad45bcb1acae780878b1192983e4d33a211f5025d9f48c0f32f7b88e6cf9fcbddf8d6caac0085e4daac@172.16.16.11:31133
2017-09-26 10:31:25 Syncing #1113925 6ae3…2a05 189 hdr/s 4945+ 0 Qed #1113925 4/50 peers 615 KiB cache 4 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:31:30 Syncing #1115041 5491…cb4a 222 hdr/s 3831+ 0 Qed #1115041 5/50 peers 1 MiB cache 3 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:31:35 Syncing #1116472 6c22…7d0d 286 hdr/s 2402+ 0 Qed #1116472 5/50 peers 2 MiB cache 2 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:31:40 Syncing #1117889 5a2a…8b15 283 hdr/s 987+ 0 Qed #1117889 5/50 peers 3 MiB cache 851 KiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:32:10 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:32:40 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:33:10 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:33:40 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:34:10 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:34:40 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:35:10 4/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:35:40 1/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:36:10 1/50 peers 4 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
Edit: Just finished a light sync on expanse without any issues.
Very interesting: now that I'm aware of the credits, I just managed to get a stuck foundation light-sync to continue by changing the --node-key:
2017-09-26 10:46:12 Syncing #2120393 b092…e206 483 hdr/s 28094+ 0 Qed #2120393 21/50 peers 10 MiB cache 24 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:17 Syncing #2123054 c757…f8df 532 hdr/s 27867+ 0 Qed #2123054 21/50 peers 10 MiB cache 23 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:22 Syncing #2127429 0b17…3df8 875 hdr/s 30018+ 0 Qed #2127429 21/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:27 Syncing #2131181 4d1b…0b5a 750 hdr/s 29596+ 0 Qed #2131181 21/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:32 Syncing #2135728 c25d…2cba 909 hdr/s 26327+ 0 Qed #2135728 21/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:37 Syncing #2139740 ad01…e10c 802 hdr/s 29869+ 0 Qed #2139740 21/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:42 Syncing #2143242 e435…a6b5 700 hdr/s 29183+ 0 Qed #2143242 20/50 peers 10 MiB cache 24 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:47 Syncing #2145662 c601…5103 484 hdr/s 28171+ 0 Qed #2145662 20/50 peers 10 MiB cache 24 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:52 Syncing #2148678 cd3f…260c 603 hdr/s 28227+ 0 Qed #2148678 20/50 peers 10 MiB cache 24 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:46:57 Syncing #2152331 f85c…6108 730 hdr/s 28798+ 0 Qed #2152331 20/50 peers 10 MiB cache 24 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:02 Syncing #2155526 48cf…88e9 639 hdr/s 29955+ 0 Qed #2155526 20/50 peers 10 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:07 Syncing #2159971 3949…3991 889 hdr/s 26406+ 0 Qed #2159971 20/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:12 Syncing #2163294 82af…c189 664 hdr/s 23080+ 0 Qed #2163294 18/50 peers 10 MiB cache 19 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:17 Syncing #2166480 399b…1dbb 637 hdr/s 19896+ 0 Qed #2166480 18/50 peers 10 MiB cache 17 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:22 Syncing #2169538 0da2…cab8 611 hdr/s 16841+ 0 Qed #2169538 18/50 peers 10 MiB cache 14 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:27 Syncing #2173759 b112…e650 844 hdr/s 12619+ 0 Qed #2173759 18/50 peers 10 MiB cache 11 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:32 Syncing #2178453 cdb1…4718 938 hdr/s 7925+ 0 Qed #2178453 18/50 peers 10 MiB cache 7 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:37 Syncing #2183165 db03…4546 942 hdr/s 3213+ 0 Qed #2183165 18/50 peers 10 MiB cache 3 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:42 Syncing #2186382 6843…e47c 643 hdr/s 0+ 0 Qed #2186382 18/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:47 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:47:57 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:02 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:07 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:12 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:17 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:22 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:27 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:32 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:37 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:42 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:47 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:52 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:48:57 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:49:02 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:49:07 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:49:12 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:49:17 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 19/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
^C0 ✓ user@eros ~ $ ~/.opt/parity/target/release/parity --light --base-path /home/user/.local/share/io.parity.ethereum-08 --ports-shift 800 --node-key randomasdfstring
2017-09-26 10:50:01 Starting Parity/v1.8.0-unstable-7940bf6ec-20170921/x86_64-linux-gnu/rustc1.19.0
2017-09-26 10:50:01 Keys path /home/user/.local/share/io.parity.ethereum-08/keys/Foundation
2017-09-26 10:50:01 DB path /home/user/.local/share/io.parity.ethereum-08/chains/ethereum/db/906a34e69aec8c0d
2017-09-26 10:50:01 Path to dapps /home/user/.local/share/io.parity.ethereum-08/dapps
2017-09-26 10:50:01 Running in experimental Light Client mode.
2017-09-26 10:50:06 Public node URL: enode://726ade9e6d43b7c07b31bc03cd92128b63387c5b085e229516d9aaf78a3081b664ff80d0786f2566d9aae254653a43ff765a9ab710eb64fde765d82cff2d850c@172.16.16.11:31103
2017-09-26 10:50:06 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 12/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:11 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 14/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:16 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 16/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:21 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 16/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:26 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 14/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
^C0 ✓ user@eros ~ $ ~/.opt/parity/target/release/parity --light --base-path /home/user/.local/share/io.parity.ethereum-08 --ports-shift 800 --node-key evenmorerandomasdfstring
2017-09-26 10:50:34 Starting Parity/v1.8.0-unstable-7940bf6ec-20170921/x86_64-linux-gnu/rustc1.19.0
2017-09-26 10:50:34 Keys path /home/user/.local/share/io.parity.ethereum-08/keys/Foundation
2017-09-26 10:50:34 DB path /home/user/.local/share/io.parity.ethereum-08/chains/ethereum/db/906a34e69aec8c0d
2017-09-26 10:50:34 Path to dapps /home/user/.local/share/io.parity.ethereum-08/dapps
2017-09-26 10:50:34 Running in experimental Light Client mode.
2017-09-26 10:50:40 Syncing #2186382 6843…e47c 0 hdr/s 0+ 0 Qed #2186382 13/50 peers 664 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:40 Public node URL: enode://4f0a5eb831d268beeb3a5fcaff12523f3f95591c44152877a4b95169229691e8bd392045c671c08751e3f0c1d68e1f38deda4afb16936083d518deb440abbc87@172.16.16.11:31103
2017-09-26 10:50:45 Syncing #2187366 ca7e…8029 196 hdr/s 3109+ 0 Qed #2187366 16/50 peers 639 KiB cache 3 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:50:55 Syncing #2189750 96e0…a890 239 hdr/s 23508+ 0 Qed #2189750 16/50 peers 2 MiB cache 20 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:51:00 Syncing #2190070 a567…948e 64 hdr/s 30098+ 0 Qed #2190070 15/50 peers 2 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:51:05 Syncing #2191324 f3a3…b8d0 250 hdr/s 29997+ 0 Qed #2191324 15/50 peers 3 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2017-09-26 10:51:10 Syncing #2192465 68d6…49c9 228 hdr/s 30009+ 0 Qed #2192465 16/50 peers 4 MiB cache 25 MiB queue RPC: 0 conn, 0 req/s, 0 µs
Edit: it might have been a coincidence; it's stuck again.
I think we should add more verbose logging when credits run out along with a countdown for when we will retry. Then users like me will know to just be patient instead of thinking something is actually broken. Hopefully more peers that serve light clients will join the network. Is there an open issue tracking better peer discovery?
On a related note (maybe this is what https://github.com/paritytech/parity/issues/6010 is about), given that I already have a fully synced node, I don't really want to waste time validating any headers on my laptop. Is there some way to skip to a known good header? For example, my archival node has #4314162 8649…1a5e so I might as well start as close to there as I can. If I could do that, the light node wouldn't need to use so many credits fetching old headers and would be a much lighter load on the network.
Providing my own archive node via reserved peers helps get the light client synced and gives me some control over "credits" (at least it feels like it does; I'm not sure how they work). I'm now at block 3.7M with the light client on foundation.
Hello! Same problem here, +1.
Parity/v1.8.0-beta-9882902-20171015/x86_64-linux-gnu/rustc1.20.0
Running in experimental Light Client mode.
Same here on v1.7.7; managed to get up to block 2.8M…
Same problem here - I'm using Parity/v1.9.0-nightly-ffee6aa-20171103/x86_64-macos/rustc1.21.0.
Stuck at this message:
2017-11-08 20:04:45 Syncing #2545959 7242…61b0 0 hdr/s 0+ 0 Qed #2545959 17/50 peers 664 bytes cache 0 bytes queue RPC: 1 conn, 0 req/s, 121 µs
I'm running parity --light.
+1 here. Stuck at 2023711. Parity version is 1.8.3
Same here; it got stuck on the light client.
Eventually it appears to get unstuck and continue.
Same here, it gets stuck... Sometimes I am able to get it running again by restarting the process, but after a few restarts it stops syncing again.
I was having this problem intermittently when trying to sync both the Kovan and Mainnet chains using parity --light. Since it was announced that everyone needs to update their Parity Kovan nodes for tomorrow's hard fork, the problem has disappeared on Kovan. It persists with Mainnet light nodes unless I provide the light node with a list of known-good, fully-updated Parity full nodes using --reserved-peers FILE --reserved-only.
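Put concretely, that workaround looks like this (peers.txt being whatever file holds your known-good enode URLs):

```
parity --light --reserved-peers peers.txt --reserved-only
```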
This suggests to me that the problem is related to a lack of full nodes willing to sync Parity light clients and/or deficiencies in a Parity light client's ability to manage its connections to ensure that it only connects to full nodes that are willing to play ball. For a Parity Mainnet light client this might be especially challenging, because Geth is running Light Ethereum Subprotocol v2 (LESv2), so the universe of compatible LESv1-friendly full nodes might be fairly small.
Geth has `geth --lightserv`, which allows a full node to decide how much time it wants to spend serving light clients. Does Parity have anything like this? Is it hardcoded, and if so, at what value? Which Parity versions are willing and able to serve light nodes?
@brandoncurtis exactly. Most of these "sync stuck" reports are simply due to having only a single serving peer which rate-limits the client. Right now the amount of time servers are willing to serve light clients is hardcoded but the machinery exists for it to be configurable.
Parity 1.7 and 1.8 should be serving light nodes
Is there any recommended workaround for this issue at the moment? I tried adding a bunch of random Parity>=1.7 nodes to my reserved peers list, but it doesn't seem to help all that much (or maybe at all). Increasing the min-peers count also does not lead to any improvement.
Same thing happens here, and normal warp sync does not fetch any snapshots. I've been trying to sync for 5 days now.
Same issue as @fugroup
I ended up using geth and mist instead of parity, with the --fast option and --cache 512; it synced in about 15 hours and used 60 GB of disk space. Use an SSD or it will not work. The chain data count reached about 50 million. Light mode worked for neither geth nor parity: not enough peers.
Same problem. Syncs for a while then stalls.
PR #7848 considerably reduces the chances of sync being stuck. However, I've noticed two other causes for this issue:
1. For some reason the sync is sometimes just very slow. I've uploaded the logs, and as you can see we continuously receive headers even though we're at 0 hdr/s. I don't really have an explanation for that. It looks like the sync is stuck, but given enough time it should successfully continue.
2. Sometimes we are stuck in the `AncestorSearch::Prehistoric` state. This happened to me once, after which I added more debugging instructions in order to get a better diagnostic. Unfortunately I haven't been able to reproduce it since then.
For reason #1 (with the logs I uploaded), sync always seems to resume after a while.
Here are other example logs.
> Sometimes we are stuck in the `AncestorSearch::Prehistoric` state
For now, the client should disconnect any peer whose common ancestor turns out to be prehistoric, perhaps validating this when first connecting. It's possible that we end up connected to a peer with the same genesis but a different head (i.e. ETC vs ETH), which then causes us to discover and connect to more peers with the same incompatibility.
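A sketch of what that check could look like (purely illustrative; the function and parameter names are invented, not Parity's real API):

```rust
/// Invented helper: decide whether a peer should be dropped based on where
/// the ancestor search ended. `first_stored` is the oldest header number we
/// still keep locally; `common_ancestor` is the result of the search.
fn should_disconnect(first_stored: u64, common_ancestor: Option<u64>) -> bool {
    match common_ancestor {
        // Ancestor is older than anything we store ("prehistoric"): the peer
        // likely shares our genesis but follows a different head (e.g. ETC
        // vs ETH), so keeping it only wastes a peer slot.
        Some(n) if n < first_stored => true,
        // No common ancestor found within our stored range at all.
        None => true,
        _ => false,
    }
}
```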
I'm still getting this. What's the fix?
Edit: restarting it has worked, so far. I'll update if it changes again.
Edit 2: it happened again at a later block, and restarting won't fix it.
I rebooted and it's syncing again. I'll see how long it lasts.
Edit: It stopped and I had to restart again. This will probably work in the end, but it's getting tedious. Why does this happen?
@rphmeier @tomaka I could reproduce reason 2 (the `AncestorSearch::Prehistoric` state) with kovan. See the issue here: https://github.com/paritytech/parity/issues/8383
The problem with this issue is that on my machine it happens only once every 10 hours or so, which makes debugging it very difficult. If you have a reproducible case, then it's very helpful.
@tomaka did you try on kovan too? It happens very often for me (roughly every 15 min) on kovan.
Here are some logs.
I'm having the same issue on foundation:
2018-04-25 14:51:35 Starting Parity/v1.10.2-beta-f4ae813fd-20180423/x86_64-windows-msvc/rustc1.25.0
2018-04-25 14:51:35 Running in experimental Light Client mode.
2018-04-25 14:51:36 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 14:51:36 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 14:51:41 Public node URL: enode://85be716e31c9626286c4dd7bbc89d48238608414543c8a4d09d63351bcddf42b25b36c1091391d3f1c36ceb9af297763a6e9c4840eb772f44803fd53b8dc54aa@127.0.0.1:30303
2018-04-25 14:51:41 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:51:46 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:51:51 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 18/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:51:56 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 15/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:01 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 22/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:06 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 23/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:11 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:16 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:21 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:26 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 23/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:31 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:36 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:41 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 26/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:46 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 21/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:51 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:52:56 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:01 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:06 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:11 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:16 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:21 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 22/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:26 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:31 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:36 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:53:41 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 19/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:54:48 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 24/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:54:56 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 26/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:02 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:07 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:12 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:17 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:22 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:27 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:32 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:37 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 26/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:42 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 20/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:47 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 14:55:52 Syncing #3089551 7383…4208 0 hdr/s 0+ 0 Qed #3089551 25/50 peers 554 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
Is there a workaround for this problem?
@ilhanu the workaround for mainnet is to use v1.11, which will be released soon (here). Right now you can build it yourself from source. v1.11 uses hardcoded block headers, which save you from having to download them all.
There are also binaries for 1.11 if you want to test light sync before it is actually released.
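For context on what the hardcoded headers buy you: the release ships a list of trusted (block number, header hash) checkpoints, so a fresh light client can start its header sync from the most recent checkpoint below the chain head instead of replaying millions of old headers. A minimal sketch of the selection logic, with illustrative names and placeholder hashes (not parity's actual spec format):

// Illustrative checkpoint: a header the client accepts on trust.
struct Checkpoint {
    number: u64,
    hash: &'static str, // placeholder; the real value is a 32-byte header hash
}

// Pick the highest checkpoint at or below the current chain head;
// fall back to genesis (block 0) if none applies.
fn sync_start(checkpoints: &[Checkpoint], chain_head: u64) -> u64 {
    checkpoints
        .iter()
        .map(|c| c.number)
        .filter(|n| *n <= chain_head)
        .max()
        .unwrap_or(0)
}

fn main() {
    let checkpoints = [
        Checkpoint { number: 0, hash: "0xd4e5…8fa3" },         // mainnet genesis
        Checkpoint { number: 5_000_000, hash: "0x0000…0000" }, // hypothetical shipped checkpoint
    ];
    println!("start header sync at #{}", sync_start(&checkpoints, 5_500_000));
}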
v1.11 doesn't change much; I deleted the cache folder in 1.10 and started anew.
Here you can see the output with the 1.11 binary I downloaded (built from @5chdn's sources):
C:\Program Files\Parity Technologies\Parity>parity --light
2018-04-25 16:01:53 Starting Parity/v1.11.0-nightly-baeda9347-20180424/x86_64-windows-msvc/rustc1.25.0
2018-04-25 16:01:53 Running in experimental Light Client mode.
2018-04-25 16:01:54 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 16:01:54 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 16:01:58 Public node URL: enode://85be716e31c9626286c4dd7bbc89d48238608414543c8a4d09d63351bcddf42b25b36c1091391d3f1c36ceb9af297763a6e9c4840eb772f44803fd53b8dc54aa@127.0.0.1:30303
2018-04-25 16:02:04 Syncing #3469376 0x25bc…0acc 880 hdr/s 30085+ 0 Qed #3469376 8/50 peers 5 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:09 Syncing #3477307 0xdb96…8483 1576 hdr/s 29704+ 4 Qed #3477307 8/50 peers 9 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:14 Syncing #3482037 0x8eaf…c9c5 941 hdr/s 29574+ 14 Qed #3482037 8/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:19 Syncing #3489835 0xac37…6dac 1549 hdr/s 29572+ 0 Qed #3489835 10/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:24 Syncing #3497524 0xf2cf…b8bb 1537 hdr/s 28288+ 23 Qed #3497524 7/50 peers 10 MiB cache 21 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:29 Syncing #3504674 0x0c75…6011 1421 hdr/s 29982+ 0 Qed #3504674 3/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:34 Syncing #3509999 0x6374…585e 1064 hdr/s 30042+ 0 Qed #3509999 6/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:34 Error removing stale DAG cache: Os { code: 5, kind: PermissionDenied, message: "Access is denied." }
2018-04-25 16:02:39 Syncing #3516839 0x2231…c547 1359 hdr/s 29465+ 0 Qed #3516839 5/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:44 Syncing #3524414 0x142a…5747 1509 hdr/s 29320+ 4 Qed #3524414 6/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:49 Syncing #3525841 0xe718…d472 284 hdr/s 29277+ 0 Qed #3525841 6/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:02:54 Error removing stale DAG cache: Os { code: 5, kind: PermissionDenied, message: "Access is denied." }
2018-04-25 16:02:59 Syncing #3525841 0xe718…d472 0 hdr/s 29295+ 0 Qed #3525841 4/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:03:04 Syncing #3525841 0xe718…d472 0 hdr/s 29586+ 0 Qed #3525841 6/50 peers 10 MiB cache 22 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:03:09 Syncing #3525841 0xe718…d472 0 hdr/s 26999+ 0 Qed #3525841 4/50 peers 10 MiB cache 20 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:03:14 Syncing #3525841 0xe718…d472 0 hdr/s 21115+ 0 Qed #3525841 3/50 peers 10 MiB cache 16 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:03:15 Error removing stale DAG cache: Os { code: 5, kind: PermissionDenied, message: "Access is denied." }
2018-04-25 16:03:19 Syncing #3525841 0xe718…d472 0 hdr/s 21115+ 0 Qed #3525841 2/50 peers 10 MiB cache 16 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:03:56 Syncing #3525841 0xe718…d472 0 hdr/s 21115+ 0 Qed #3525841 3/50 peers 10 MiB cache 16 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:04 Syncing #3525841 0xe718…d472 0 hdr/s 7194+ 0 Qed #3525841 3/50 peers 10 MiB cache 5 MiB queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:09 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 1/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:14 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:19 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:24 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:29 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:34 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:39 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:44 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 3/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:49 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:54 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:04:59 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:05:52 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:05:59 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:06:04 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 7/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:06:09 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 10 MiB cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
C:\Program Files\Parity Technologies\Parity>parity --light
2018-04-25 16:06:48 Starting Parity/v1.11.0-nightly-baeda9347-20180424/x86_64-windows-msvc/rustc1.25.0
2018-04-25 16:06:48 Running in experimental Light Client mode.
2018-04-25 16:06:48 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 16:06:48 Multi-threaded server is not available on Windows. Falling back to single thread.
2018-04-25 16:06:53 Public node URL: enode://85be716e31c9626286c4dd7bbc89d48238608414543c8a4d09d63351bcddf42b25b36c1091391d3f1c36ceb9af297763a6e9c4840eb772f44803fd53b8dc54aa@127.0.0.1:30303
2018-04-25 16:06:58 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:03 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:08 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:13 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:18 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:23 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:28 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:33 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 4/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:38 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:43 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:48 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 6/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:53 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 5/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2018-04-25 16:07:58 Syncing #3525841 0xe718…d472 0 hdr/s 0+ 0 Qed #3525841 7/50 peers 553 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
@ilhanu you need to run parity --light db kill first, then relaunch.
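Spelled out on the Windows install from the logs above, that is:
C:\Program Files\Parity Technologies\Parity>parity --light db kill
C:\Program Files\Parity Technologies\Parity>parity --light
db kill wipes the light client's database, so the next launch resyncs from scratch instead of resuming from the wedged state.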
I'm having the same issue, but whenever it happens my I/O also goes through the roof. Can anyone confirm the same?
Using iotop, my I/O sits at 100% (often reported even higher) constantly once I stop receiving headers.
This should be solved by now.
I can confirm I have not had any syncing problems with at least 6 recent light nodes/syncs.
good job
Still observing this.
@Dominator008
The code base has changed a bit lately; can you open a new issue, or alternatively provide some more details, such as which version you're running, how many peers you have connected, and so on?
It would be good if you could run parity --light -l pip=trace,sync=trace and provide the output!
Thanks
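As a general tip for anyone collecting these traces: recent builds also accept --log-file, so the trace can be appended straight to a file instead of scraping the console, e.g.:
parity --light -l pip=trace,sync=trace --log-file parity-trace.log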
I am experiencing this issue with the light client on the classic chain.
sync=debug reports:
Sync target reached. Going idle
2019-04-15 16:20:23 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #0 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #3 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #1 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #0 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #1 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #0 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #0 DEBUG sync Maintaining sync (AncestorSearch(Awaiting(ReqId(89), 7856016, Complete { start: Number(7856016), skip: 0, max: 64, reverse: true })))
2019-04-15 16:20:23 IO Worker #1 DEBUG sync Maintaining sync (AncestorSearch(Queued(7856016)))
2019-04-15 16:20:24 IO Worker #2 DEBUG sync Found common ancestor with best chain
2019-04-15 16:20:24 IO Worker #2 DEBUG sync Maintaining sync (AncestorSearch(FoundCommon(7856016, 0x1ca1a09a7217eae058c3e03e53bcbb071a6a79f8a4fec8603c6a37bf84a91a48)))
2019-04-15 16:20:24 IO Worker #3 INFO import Syncing #7856016 0x1ca1…1a48 0.0 hdr/s 0+ 0 Qed 45/100 peers 560 bytes cache 0 bytes queue RPC: 0 conn, 0 req/s, 0 µs
2019-04-15 16:20:27 IO Worker #3 DEBUG sync Maintaining sync (Rounds(Aborted: TargetReached, 0 remain))
2019-04-15 16:20:27 IO Worker #3 DEBUG sync Sync target reached. Going idle
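Since the stall signature above is so regular (the same ReqId(89) repeated across every Maintaining sync line), a tiny watchdog over parity's console output can flag it automatically. This is purely hypothetical tooling sketched for this thread, not part of parity:

use std::io::{self, BufRead};

// Reads parity's log stream from stdin and warns when the same
// "AncestorSearch(Awaiting(ReqId(N), ..." fragment repeats many times
// in a row, which matches the stuck pattern shown above.
fn main() {
    let stdin = io::stdin();
    let mut last_req = String::new();
    let mut repeats = 0u32;
    for line in stdin.lock().lines().flatten() {
        if let Some(idx) = line.find("AncestorSearch(Awaiting(ReqId(") {
            let req: String = line[idx..].chars().take_while(|c| *c != ',').collect();
            if req == last_req {
                repeats += 1;
            } else {
                last_req = req;
                repeats = 0;
            }
            if repeats == 10 {
                eprintln!("light sync looks stuck on {}", last_req);
            }
        }
    }
}

Piping the sync=debug output shown above through it would have flagged this run within a few seconds.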
Same here, it gets stuck... Sometimes I am able to get it running again by restarting the process, but after a few restarts it stops syncing again.