Pytorch-lightning: multiprocessing cpu only training

Created on 12 Sep 2019 · 7 comments · Source: PyTorchLightning/pytorch-lightning

I would like to know whether it's possible to train a model with multiprocess parallelism on CPU only (no GPU available) using Lightning (a synchronous analogue of Hogwild: https://pytorch.org/docs/stable/notes/multiprocessing.html#hogwild)? After a quick glance, I have the impression that all the parallelism options in Trainer are GPU-based (if I'm not mistaken, torch DDP supports multiprocess CPU-only training).
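For context, the Hogwild-style CPU training referenced above can be done in plain PyTorch (outside Lightning) by putting the model's parameters in shared memory and letting several processes update them without locks. This is a minimal sketch of that pattern; the toy model, data, and step count are illustrative assumptions, not from the issue:

```python
import torch
import torch.multiprocessing as mp
import torch.nn as nn

def train(model, steps=100):
    # Hogwild: each process updates the *shared* parameters without locking.
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    for _ in range(steps):
        x = torch.randn(8, 4)   # toy data, stands in for a real DataLoader
        y = torch.randn(8, 1)
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()

if __name__ == "__main__":
    model = nn.Linear(4, 1)
    model.share_memory()  # move parameters into shared memory for all workers
    workers = [mp.Process(target=train, args=(model,)) for _ in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
```

Note this is the asynchronous scheme from the linked PyTorch docs; the question asks about a synchronous equivalent, which is what DDP provides.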

enhancement good first issue question

All 7 comments

good question. we only support distributed GPU but would welcome a PR to support multi-CPU options.
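For reference, `torch.distributed` itself already supports synchronous CPU-only multiprocess training via the `gloo` backend, which is roughly what a multi-CPU Trainer option would wrap. A minimal sketch outside Lightning (the address, port, and toy model are illustrative assumptions):

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank, world_size):
    # gloo is the CPU-capable backend; nccl requires GPUs.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)
    model = DDP(nn.Linear(4, 1))  # no device_ids -> model stays on CPU
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x, y = torch.randn(8, 4), torch.randn(8, 1)
    opt.zero_grad()
    # backward() all-reduces gradients across processes (synchronous SGD)
    nn.functional.mse_loss(model(x), y).backward()
    opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    mp.spawn(worker, args=(2,), nprocs=2)
```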

Hah! I was just looking for this. If nothing else, it would make testing DDP much easier.

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@artemru @neggert are you interested in sending a PR? :robot:

I would like to take this up.

Cool! Thx @skepticleo

Looks like this was recently added in #1158 :)

