Pytorch-lightning: This is really a bad idea. What if someone really doesn't want to sample/shuffle their data during training?

Created on 7 Aug 2020 · 3 comments · Source: PyTorchLightning/pytorch-lightning

All 3 comments

Hi! Thanks for your contribution! Great first issue!

Most users want that behaviour. But if you don't, you can turn it off by setting the `replace_sampler_ddp` flag on the Trainer and providing your own sampler.
https://pytorch-lightning.readthedocs.io/en/latest/trainer.html#replace-sampler-ddp
Feel free to reopen if that doesn't solve your problem.
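For anyone landing here later, a minimal sketch of that setup, assuming the `replace_sampler_ddp` Trainer flag from the docs linked above; `MyModel` is a placeholder for your own LightningModule:

```python
import torch
from torch.utils.data import DataLoader, SequentialSampler, TensorDataset
import pytorch_lightning as pl

dataset = TensorDataset(torch.randn(100, 8), torch.randint(0, 2, (100,)))

# Provide the sampler yourself -- a SequentialSampler here, so the data is
# neither shuffled nor resampled.
loader = DataLoader(dataset, batch_size=16, sampler=SequentialSampler(dataset))

# Tell Lightning not to swap in its own (Distributed)Sampler.
trainer = pl.Trainer(replace_sampler_ddp=False, max_epochs=1)
# trainer.fit(MyModel(), loader)
```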

Thank you, awaelchli. This solves my problem!
