Pytorch-lightning: Is it possible to make `validation_step` and `val_dataloader` no-ops?

Created on 9 Aug 2019  ·  5 comments  ·  Source: PyTorchLightning/pytorch-lightning

Is your feature request related to a problem? Please describe.
Sometimes I don't have a separate validation split, only a train/test split. I'm trying out pytorch-lightning to prototype / experiment, and trying to see what the best way of doing this is.

I could build the train dataset and then use torch.utils.data.random_split or torch.utils.data.SubsetRandomSampler to carve out a validation set as well, but if I don't have enough data (or just don't want to do a separate validation step) this isn't ideal.
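
For reference, carving a small validation split out of the training data with random_split would look roughly like this (the toy dataset and the 10% split size are just placeholders):

import torch
from torch.utils.data import TensorDataset, random_split

# Toy dataset standing in for the real training data.
full_train = TensorDataset(torch.randn(100, 3), torch.randint(0, 2, (100,)))

# Hold out 10% of the samples as a validation split.
n_val = int(0.1 * len(full_train))
train_set, val_set = random_split(full_train, [len(full_train) - n_val, n_val])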

Describe the solution you'd like
I'd like to be able to implement only the training_step, train_dataloader, and test_dataloader methods and then have the validation step and validation metrics be omitted (maybe explicit no-ops). Right now, I'm experimenting with having an empty DataLoader for the validation data.

Describe alternatives you've considered

  • Implement val_dataloader with an empty (dummy) DataLoader

    • Not sure if this will work yet (whether lightning will still call validation_step and validation_end); a rough sketch of the idea follows below.
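
A minimal sketch of that dummy-loader idea, assuming a DataLoader over a zero-length dataset is enough to leave the validation loop with nothing to do:

import torch
from torch.utils.data import DataLoader, TensorDataset

# A validation loader with zero samples: iterating it yields no batches,
# so validation_step would never run even if the loop is still entered.
empty_dataset = TensorDataset(torch.empty(0, 1))
empty_val_loader = DataLoader(empty_dataset, batch_size=1)

assert len(list(empty_val_loader)) == 0  # no validation batches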

enhancement help wanted

Most helpful comment

live on master now

All 5 comments

great idea. val call can just be made optional! very easy to do.

do you want to give it a shot?

  1. remove the not-implemented warning from validation_step and add pass instead.
  2. allow val_dataloader to be None.
  3. in the val loop, check if val_loader is None at the very beginning and return if it is (a rough sketch follows below).
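
Step 3 is essentially just an early return; an illustrative sketch (not the actual Trainer source, names are made up) might look like:

# Illustrative only -- not the real Trainer code.
def run_validation(model, val_loader):
    # Bail out right away when no validation loader was provided.
    if val_loader is None:
        return
    for batch_idx, batch in enumerate(val_loader):
        model.validation_step(batch, batch_idx)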

Thanks, I'll take a stab at that this weekend!

live on master now

Dang, too fast for me :) Thanks for working on this!

Sorry for writing in this thread, but how can I use it in my model? Is it enough to do something like:

@pl.data_loader
def val_dataloader(self):
    if has_valset:
        return self.__dataloader(train=False)

    # No validation set
    return None

After that do I need to do something like:

def validation_step(self, batch, batch_idx):
    # Is this check necessary?
    if not has_valset:
        return

    # process the validation batch