I'm running:
- Which Parity version?: 1.11.8
- Which operating system?: Linux
- How installed?: -
- Are you fully synchronized?: -
- Which network are you connected to?: -
- Did you try to restart the node?: -
In the documentation https://wiki.parity.io/Configuring-Parity-Ethereum.html, please clarify what the
--jsonrpc-threads and --jsonrpc-server-threads parameters are:
jsonrpc-server-threads sets the number of threads for the multi-threaded HTTP server (it can handle multiple incoming requests at the same time);
jsonrpc-threads sets the size of a cpupool that all RPC requests are dispatched to (shared between all transports), so when you have some heavy calls they are dispatched in parallel on different threads.
WebSockets and IPC are single-threaded transports that can dispatch deserialized requests to multiple threads so that they are processed in parallel. In other words, deserialization happens sequentially, while processing the requests is done in parallel.
HTTP can run multiple server threads, so you can deserialize requests in parallel and then run them in parallel as well.
Fine-tuning performance may involve changing both parameters, depending on the transports you are using and the characteristics you observe (e.g. timeouts).
For instance, you may run with --jsonrpc-server-threads 8 --jsonrpc-threads 0, which will be able to handle 8 parallel requests over HTTP (deserialization + processing), but while these 8 requests are being processed the HTTP server will be blocked (it won't accept more connections).
Running with --jsonrpc-server-threads 3 --jsonrpc-threads 4 however allows you to process 4 requests in parallel, while you can deserialize 3 incoming requests in parallel. While processing threads are occupied you are still accepting incoming HTTP requests, but the responses might take long.
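The same tuning can be expressed in a config file. This is a sketch only: the `[rpc]` section name is an assumption, while the `server_threads` and `processing_threads` key names come from the discussion below, so check your Parity version's config reference before relying on it.

```toml
[rpc]
# Assumed section name; verify against your version's config reference.
server_threads = 3      # --jsonrpc-server-threads: parallel HTTP accept + deserialize
processing_threads = 4  # --jsonrpc-threads: shared cpupool executing the RPC calls
```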
Credit to @tomusdrw for those explanations.
@phahulin PRs are always welcome if you feel you can come up with a concise and better explanation for those flags: https://github.com/paritytech/parity/blob/master/parity/cli/mod.rs
@Tbaut thank you. I'd like to clarify a few more things:
jsonrpc-threads is a cpupool, so it doesn't make sense to set this value higher than the number of CPU cores, right? (server_threads, processing_threads, from here: https://github.com/paritytech/parity-ethereum/blob/1b1941a896c8485a05fa3e9ffe8b251b498eb541/parity/cli/mod.rs#L486)

> Running with --jsonrpc-server-threads 3 --jsonrpc-threads 4 however allows you to process 4 requests in parallel, while you can deserialize 3 incoming requests in parallel. While processing threads are occupied you are still accepting incoming HTTP requests, but the responses might take long.
Does this mean that there are 7 threads: 3 server threads handling HTTP requests, deserializing the JSON and then passing them on to the 4 processing threads, which do all of the processing and then pass the response back to the HTTP server thread? I.e., does the server thread not do any processing in this case? And if you have more server threads than processing threads, do requests queue up for processing?
@dtran320 Correct, the server threads just handle incoming connections and do serialization (no processing), then dispatch to a cpupool and await the future's completion; when the response is ready, they are responsible for replying.
Processing requests may obviously queue up in the pool.
Although please note that this behaviour has changed on latest master after #9657, currently processing_threads are not used at all, I hope to look into either removing that param or restoring the cpupool after conducting some performance testing.
Hey guys. I am trying to send a lot of requests to my Parity private network. Each node has 2 cores (Docker containers) and I am trying to achieve thousands of JSON-RPC tx/s spread over the network. I am using a workload generator (tung).
I observed that my network performs well up to 300 tx/s; after that, Parity starts to ignore RPC requests and throughput decreases. I tried setting server_threads = 2 (since I only have 2 cores per node) in the config.toml of each node, and performance increased a bit. Could you please confirm that by increasing the cores, and thus the server threads, I will be able to handle more requests? Are the node cores my issue?
What is the default for server_threads?