Pytorch-lightning: is limit_train_batches shuffled or random?

Created on 12 Aug 2020 · 6 comments · Source: PyTorchLightning/pytorch-lightning

Hi, I am using limit_train_batches. If it is set, does that mean a sub-dataset of the whole train dataset is used, similar to torch.utils.data.random_split?

question

All 6 comments

Hi! thanks for your contribution!, great first issue!

Yes, it is a subset of the train dataset,
but it is not the same as random_split.

@ydcjeff I mean, is it random?

I think it is not random. It takes the first limit_train_batches batches of the train dataloader.

Yes exactly, @ydcjeff is right. It will fetch batches from the dataloader until it reaches that amount, so your dataset and dataloader settings regarding shuffling will be respected.
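A minimal sketch of this truncation behavior (not Lightning's actual implementation): the trainer simply stops pulling batches after the limit, so whether those batches are shuffled depends entirely on the dataloader. The toy loader below, and the batch size of 2, are assumptions for illustration only.

```python
import random

def limited_batches(dataloader, limit_train_batches):
    """Yield only the first `limit_train_batches` batches.

    This mimics, in spirit, what limit_train_batches does: iteration just
    stops early, so any shuffling comes from the dataloader itself.
    """
    for i, batch in enumerate(dataloader):
        if i >= limit_train_batches:
            break
        yield batch

def make_loader(shuffle):
    """A toy 'dataloader': batches of size 2 over indices 0..9."""
    indices = list(range(10))
    if shuffle:
        random.shuffle(indices)
    return [indices[i:i + 2] for i in range(0, len(indices), 2)]

# Without shuffling, the limited run always sees the same leading batches:
print(list(limited_batches(make_loader(shuffle=False), 2)))  # [[0, 1], [2, 3]]

# With shuffle=True the same truncation applies, but the batch contents
# vary from run to run because the loader itself shuffles.
print(list(limited_batches(make_loader(shuffle=True), 2)))
```

So with `shuffle=True` on your DataLoader you get a different random subset each epoch; with `shuffle=False` you always train on the same leading batches.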

@awaelchli @ydcjeff thx
