I want to set up a server for North America. I've been doing some research, but there are some different numbers showing up. It seems like 64 GB of RAM is enough for pre-processing; I was wondering if 10 GB is enough for runtime. Also, how powerful a CPU is needed to get this done in a timely manner? I'm hoping for something reasonable, not too expensive.
I recently processed the north-america extract from Geofabrik with the default car profile and the CH toolchain on v5.19.0. Running osrm-routed on the resulting *.osrm files loads 23.1 GB of data into RAM.
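For anyone reproducing this, the CH toolchain boils down to three commands. This is a sketch under assumptions: OSRM v5.x binaries on the PATH, the stock car profile at profiles/car.lua, and Geofabrik's default filename for the extract — adjust paths to your install.

```shell
# Sketch of the CH pipeline (filenames and profile path are assumptions)

# 1. Parse the Geofabrik PBF with the car profile into .osrm files
osrm-extract -p profiles/car.lua north-america-latest.osm.pbf

# 2. Build the contraction hierarchy (the CPU/RAM-heavy step)
osrm-contract north-america-latest.osrm

# 3. Serve it; this is the process that loads ~23 GB into RAM
osrm-routed --algorithm ch north-america-latest.osrm
```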
Thank you! What CPU did you use?
I launched the latest (v5.22.0) OSRM with MLD a year ago; here's some profile data for your reference:

1) Pre-processing (osrm-extract, osrm-partition, osrm-customize) peaks at more than 64 GB of RAM ([info] RAM: peak bytes used: 64719273984).
2) osrm-routed requires about the same as the compiled file size, about 35 GB.

More profile data, FYI: https://github.com/Telenav/open-source-spec/blob/master/osrm/doc/osrm-profile.md#na-mld
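For what it's worth, that logged peak converts as follows (plain shell arithmetic on the number quoted above):

```shell
# Convert the logged peak (bytes) to decimal GB and binary GiB
PEAK=64719273984
echo "GB:  $(( PEAK / 1000000000 ))"   # decimal gigabytes -> 64
echo "GiB: $(( PEAK / 1073741824 ))"   # binary gibibytes  -> 60
```

So the peak is roughly 64.7 GB (about 60 GiB), right at the capacity of a 64 GB instance, which is why pre-processing needs headroom or swap there.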
Now I'm using an AWS r5.2xlarge instance, i.e. 8 CPUs and 64 GB of memory, configured with about 20 GB of swap to make sure there's enough memory for pre-processing too.
Be aware that there's a resources/time tradeoff as well - the more threads/CPUs you throw at it, the more RAM required. I don't have exact numbers, but much of the processing is highly concurrent - each additional thread/CPU doing work requires its own block of RAM.
If you have lots of CPUs and limited RAM, you can still sometimes get away with processing by limiting the concurrency (via the --threads parameter).
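To make the tradeoff concrete, here's a toy model, not measured data: assume a fixed base cost plus one block of RAM per worker thread. The 30 GB base and 4 GB per thread are invented numbers purely for illustration.

```shell
# Toy linear model of peak RAM vs --threads (numbers are made up)
BASE_GB=30        # hypothetical fixed cost
PER_THREAD_GB=4   # hypothetical per-worker block
for threads in 2 4 8 16; do
  echo "threads=$threads -> ~$(( BASE_GB + threads * PER_THREAD_GB )) GB"
done
```

The point is simply that under a linear model, halving --threads can meaningfully lower peak RAM at the cost of wall-clock time.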