Pytorch-lightning: Incompatible Trainer.test()

Created on 3 May 2020  ·  9 Comments  ·  Source: PyTorchLightning/pytorch-lightning

🐛 Bug

Trainer.test() fails when the train and validation dataloader methods are not defined on the model.

To Reproduce

Steps to reproduce the behavior:

  • Define a LightningModule without train & validation dataloaders
  • Train the model with data loaders passed to the Trainer.fit(...) method
  • Load the model from a checkpoint and test it with the Trainer.test(...) method

The test fails with:
~~~python
'No train_dataloader() method defined. Lightning Trainer expects as minimum a'
pytorch_lightning.utilities.exceptions.MisconfigurationException: No train_dataloader() method defined. Lightning Trainer expects as minimum a training_step(), training_dataloader() and configure_optimizers() to be defined.
~~~

Code sample

~~~python
test_data = data_loader(...)
estimator = MyEstimator.load_from_checkpoint(checkpoint_path=CKPT_PATH)
trainer = pl.Trainer(...)
trainer.test(model=estimator, test_dataloaders=test_data)
~~~

Expected behaviour

Run the test with the provided test data loader, and do not fail on the check for missing train & validation dataloaders, which are not actually mandatory for testing.

Environment

~~~
  • Packages:
    - numpy: 1.18.4
    - pyTorch_debug: False
    - pyTorch_version: 1.5.0
    - pytorch-lightning: 0.7.5
    - tensorboard: 2.2.1
    - tqdm: 4.46.0
  • System:
    - OS: Linux
    - architecture: 64bit, ELF
    - processor:
    - python: 3.8.2
    - version: #1 SMP Wed, 29 Apr 2020 16:23:03 +0000
~~~
Labels: Priority P0, bug / fix, help wanted

All 9 comments

Hi! thanks for your contribution!, great first issue!

This issue relates to #1195; fit and test should be decoupled. Is there any consensus or proposal for further steps?

Can't we just add

~~~python
if not self.testing:
    self.check_model_configuration(model)
~~~

here? During testing, the test configuration is already checked here.

Also, using trainer.test(model) on a loaded checkpoint doesn't work without explicitly passing the test_dataloaders parameter, even if all *_dataloader methods are defined. Example.
This use case isn't documented in the 0.7.5 documentation but evidently was relevant at some point: https://pytorch-lightning.readthedocs.io/en/latest/test_set.html#test-pre-trained-model.
I suggest either:

  • fix the Test set documentation
  • (preferred) allow calling trainer.test(model) without specifying the test_dataloaders parameter. After all, what's the point of defining a Model.test_dataloader method?

@iakremnev can you fix the link? It's asking for request access.

@rohitgr7 oops, fixed it.

The error comes from this line. Why does it check whether test_dataloader() returns something, rather than whether the method is overridden? Checking for an override would be enough, just like it is done for .fit() here.

I would suggest

~~~python
gave_test_loader = isinstance(model.test_dataloader, _PatchDataLoader) or self.is_overridden('test_dataloader', model)
~~~

here, and

~~~python
if not self.testing:
    self.check_model_configuration(model)
~~~

here.

It is the same as #1754.

