When the Trainer class loads the model state instead of using load_from_checkpoint, the on_load_checkpoint hook won't be called. It would be better to have the trainer call load_from_checkpoint instead of calling model.load_state_dict directly, so that on_load_checkpoint and other hooks get invoked.
I need to do some sanity checks on the saved hparams, and the natural place for this is on_load_checkpoint. But this method is not called when the Trainer is used to load the state.
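To illustrate the asymmetry, here is a minimal pure-Python sketch of the two restore paths. The class and method names mirror Lightning's API, but this is an illustrative model I wrote for this issue, not the actual library code:

```python
# Illustrative sketch: two restore paths, only one of which runs the hook.
# Names (load_from_checkpoint, on_load_checkpoint, load_state_dict) mirror
# Lightning's API; the bodies are simplified stand-ins.

class Model:
    def __init__(self):
        self.hparams = {"lr": 0.1}
        self.hook_called = False

    def load_state_dict(self, checkpoint):
        # Restore weights/hparams from the checkpoint dict.
        self.hparams.update(checkpoint.get("hparams", {}))

    def on_load_checkpoint(self, checkpoint):
        # The hook where e.g. hparam sanity checks would live.
        self.hook_called = True

    @classmethod
    def load_from_checkpoint(cls, checkpoint):
        model = cls()
        model.on_load_checkpoint(checkpoint)  # hook runs on this path
        model.load_state_dict(checkpoint)
        return model


def trainer_restore(model, checkpoint):
    # Mimics the Trainer restoring state directly via load_state_dict:
    # the on_load_checkpoint hook is never invoked.
    model.load_state_dict(checkpoint)


ckpt = {"hparams": {"lr": 0.01}}

m1 = Model.load_from_checkpoint(ckpt)
m2 = Model()
trainer_restore(m2, ckpt)

print(m1.hook_called)  # True  -- hook ran
print(m2.hook_called)  # False -- hook silently skipped
```

Both objects end up with the same restored hparams, but any validation placed in on_load_checkpoint silently never runs on the Trainer path.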
to me it makes sense to call on_load_checkpoint inside load_from_checkpoint :]
@jeffling @hadim any thoughts?
This is a good change, but we may want to think about the versioning here since it may break some codebases.
@PyTorchLightning/core-contributors any other thoughts?
I just hit this issue, and would love it in the next release!