Dear all,
Is lr_scheduler (or a list of them) exposed to pl.LightningModule as self.scheduler or something similar? I only see self.trainer, but self.scheduler would really complement the trainer!
I tried looking for self.scheduler (and different variants) in the source and the PDF manual but found nothing.
Hi! Thanks for your contribution, great first issue!
It's not there yet. Mind submitting a PR to add that?
Something similar for reference:
https://github.com/PyTorchLightning/pytorch-lightning/blob/7b375ed1d3ce5f55c1021038d2892dce6ae8bd66/pytorch_lightning/core/lightning.py#L117-L126
Looks like we should be accessing it via self.trainer.schedulers (rather than directly via self.schedulers) for consistency with self.trainer.optimizers. I'll submit a PR soon.
Hey, I think you can access the schedulers via trainer.lr_schedulers, so from your pl_module you could use self.trainer.lr_schedulers.
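For example, something like this might work inside your LightningModule. Treat it as a sketch: the exact layout of trainer.lr_schedulers (here assumed to be a list of dicts with the scheduler under a 'scheduler' key) can differ between Lightning versions.

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
        return [optimizer], [scheduler]

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.layer(x), y)
        # Reach the schedulers through the trainer reference.
        # Assumption: trainer.lr_schedulers is a list of dicts with the
        # actual scheduler object stored under the 'scheduler' key.
        for entry in self.trainer.lr_schedulers:
            current_lr = entry["scheduler"].get_last_lr()[0]
            self.log("lr", current_lr)
        return loss
```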
@MohammedAljahdali I don't see lr_schedulers anywhere in trainer.py. Would you mind pointing me to the right location where it is defined?
@seannz it's added here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/a32bffcdea3c12fb369f2cb87cbcf8b7e91396e6/pytorch_lightning/accelerators/accelerator.py#L187-L194
Yes, you can access it via self.trainer, but I think it could be exposed on the LightningModule as a property, just like optimizers.
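Roughly, a sketch of what such an accessor might look like (written here on a user subclass; the method name and the trainer.lr_schedulers entry layout are assumptions, not the actual Lightning API):

```python
from typing import List, Union

import pytorch_lightning as pl
from torch.optim.lr_scheduler import _LRScheduler


class MyModule(pl.LightningModule):
    """Sketch of the proposed accessor on a user subclass.

    In Lightning itself this would live on LightningModule and mirror
    how optimizers() delegates to the trainer.
    """

    def lr_schedulers(self) -> Union[_LRScheduler, List[_LRScheduler]]:
        # Assumption: trainer.lr_schedulers is a list of dicts with the
        # scheduler object stored under the 'scheduler' key.
        schedulers = [entry["scheduler"] for entry in self.trainer.lr_schedulers]
        return schedulers[0] if len(schedulers) == 1 else schedulers
```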