Pytorch-lightning: Call load_from_checkpoint when the trainer loads state

Created on 18 Mar 2020  ·  4 comments  ·  Source: PyTorchLightning/pytorch-lightning

🚀 Feature

When the Trainer class is used to load the model state (rather than calling load_from_checkpoint), the on_load_checkpoint hook is not called. It would be better for the trainer to call load_from_checkpoint instead of calling model.load_state_dict directly, so that on_load_checkpoint and the other checkpoint hooks are invoked.
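To illustrate the report, a minimal sketch (the model and checkpoint path are hypothetical, and the trainer's restore logic is reduced to its essence):

```python
import torch
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 2)

    def on_load_checkpoint(self, checkpoint):
        # Hook that should run whenever a checkpoint is restored.
        print("on_load_checkpoint called")


# Restoring through the Trainer boils down to a direct
# load_state_dict on the saved weights, so the hook above never fires:
model = MyModel()
checkpoint = torch.load("example.ckpt")  # hypothetical path
model.load_state_dict(checkpoint["state_dict"])
```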

Motivation

I need to run some sanity checks on the saved hparams, and the natural place for this is on_load_checkpoint. But this hook is not called when the Trainer loads the state.
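For instance, such a check might look like the following (a sketch: the learning_rate field is hypothetical, and the checkpoint key holding the hparams depends on the Lightning version, "hparams" in older releases and "hyper_parameters" in later ones):

```python
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def on_load_checkpoint(self, checkpoint):
        # Sanity-check the saved hparams before any state is restored.
        saved = checkpoint.get("hparams", {})  # key is version-dependent
        if "learning_rate" not in saved:  # hypothetical field
            raise ValueError("checkpoint hparams are missing learning_rate")
```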

Labels: discussion, enhancement, help wanted

All 4 comments

To me it makes sense to call on_load_checkpoint inside load_from_checkpoint :]
@jeffling @hadim any thoughts?
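A rough sketch of that idea, using a toy stand-in rather than Lightning's actual classmethod (which also handles hparams, map_location, and more):

```python
import torch
import torch.nn as nn


class LitModuleSketch(nn.Module):
    """Toy stand-in for LightningModule, only to show where the
    proposed hook call could sit inside load_from_checkpoint."""

    def on_load_checkpoint(self, checkpoint):
        pass  # user override point

    @classmethod
    def load_from_checkpoint(cls, checkpoint_path, **kwargs):
        checkpoint = torch.load(checkpoint_path)
        model = cls(**kwargs)
        model.on_load_checkpoint(checkpoint)  # <-- the proposed call
        model.load_state_dict(checkpoint["state_dict"])
        return model
```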

This is a good change, but we may want to think about versioning here, since it may break some codebases.

@PyTorchLightning/core-contributors any other thoughts?

I just hit this issue, and would love it in the next release!

