I'd like to be able to cluster a process, but have one of the processes in the cluster play a special role, in this case to act as a garbage collector:
[
  {
    "name"      : "server",
    "script"    : "./server.js",
    "instances" : 3,
    "exec_mode" : "cluster_mode",
    "port"      : 9090
  },
  {
    "name"      : "server-gc",
    "script"    : "./server.js",
    "args"      : ["--role=gc"],
    "instances" : 1,
    "exec_mode" : "cluster_mode",
    "port"      : 9090
  }
]
This does not work because PM2 refuses to start the second script. Is there any way to do something similar to this with PM2, or am I going to have to code my own special-purpose solution?
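For reference, the intent is that server.js would inspect its arguments and switch behavior, roughly like this (a minimal sketch; the actual garbage-collection work is elided):

// server.js -- sketch of how the --role=gc flag would be consumed
var http = require('http');

if (process.argv.indexOf('--role=gc') !== -1) {
  // special-role instance: run periodic cleanup instead of serving requests
  setInterval(function () {
    // ...garbage-collection work would go here...
  }, 60 * 1000);
} else {
  // ordinary instance: serve HTTP on the shared cluster port
  http.createServer(function (req, res) {
    res.end('ok');
  }).listen(9090);
}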
I think you can't do this, and you won't be able to do it with pm2. A solution would be to clone your ./server.js and run it as another process.
The issue will stay open in case you find a solution.
I ended up coding my own special-purpose daemon script instead of using PM2. It would be nice if this were a feature of PM2, but it turned out to be easier to write my own process manager.
pm2 start -f ./path/to/your/config.json

Just add the -f flag to force execution of the same script twice.
Yup, that's a nice fix, thanks @rlidwka!
Ah very nice. Just to be clear: @rlidwka does this also properly cluster the four resulting process instances, or does it just force the execution of the fourth instance without adding it to the cluster that the first 3 are running in?
"does this also properly cluster the four resulting process instances"
Yes, from pm2's point of view all four processes are treated equally. You can even run completely different code; it would be added to the cluster as well.
This happens because pm2 manages just one single cluster, and all processes get added to it.
But there is no guarantee that the 4th instance will ever receive any requests: in node 0.10 all requests could easily be sent to the 1st and 2nd ones (if they respond quickly enough). I think this was fixed in 0.11, but you should test it if it's important for you.
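For example, both apps in a declaration like this would end up in the same cluster (web.js and worker.js here are just placeholder names for any two different scripts):

[
  {
    "name"      : "web",
    "script"    : "./web.js",
    "instances" : 2,
    "exec_mode" : "cluster_mode"
  },
  {
    "name"      : "worker",
    "script"    : "./worker.js",
    "instances" : 1,
    "exec_mode" : "cluster_mode"
  }
]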
Okay, thanks for the explanation. At this point I've already switched to a handcoded system that uses Nginx as the load balancer / proxy upstream of a cluster of processes, each running on a different port, but I'm sure this will help someone else who has a similar issue.
I created a symlink to the script and start that one.
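Something like this, assuming the config lives next to server.js (server-gc.js is just the link name I picked):

ln -s server.js server-gc.js

Then the second app's "script" can point at ./server-gc.js, so pm2 sees two distinct script paths and starts both without -f.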
Four years later, we still don't have this natively, without a -f flag?
"I created a symlink to the script and start that one."

Thanks for the workaround.
It's 2019 and we still don't have this feature; the -f flag doesn't work for me either.