Pytorch-lightning: This is really a bad idea. What if someone really doesn't want to sample/shuffle their data during training?

Created on 7 Aug 2020 · 3 comments · Source: PyTorchLightning/pytorch-lightning

All 3 comments

Hi! Thanks for your contribution, great first issue!

Most users want that. But if you don't, you can turn it off by setting the `replace_sampler_ddp` flag in the Trainer and providing your own sampler.
https://pytorch-lightning.readthedocs.io/en/latest/trainer.html#replace-sampler-ddp
Feel free to reopen if that doesn't solve your problem.
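
For anyone landing here later, here is a minimal sketch of that setup. The module and dataset names (`LitRegressor`, the random `TensorDataset`) are illustrative, not from this issue; the assumption is a Lightning version that exposes the `replace_sampler_ddp` Trainer flag documented at the link above.

```python
import torch
from torch.utils.data import DataLoader, SequentialSampler, TensorDataset
import pytorch_lightning as pl


class LitRegressor(pl.LightningModule):
    """Minimal module; only the DataLoader/Trainer wiring matters here."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


dataset = TensorDataset(torch.randn(64, 8), torch.randn(64, 1))

# Keep the data in its original order: no shuffling, explicit sampler.
loader = DataLoader(dataset, batch_size=8, sampler=SequentialSampler(dataset))

# replace_sampler_ddp=False tells Lightning not to swap in its own
# (Distributed)Sampler, so the sampler above is used as-is.
trainer = pl.Trainer(max_epochs=1, replace_sampler_ddp=False)
trainer.fit(LitRegressor(), loader)
```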


Thank you, awaelchli. This solves my problem!
