I think the title explains a lot, but let me elaborate. I have a LightningModule whose configure_optimizers method returns an optimizer and a scheduler. Later, in a Callback, I have an on_batch_end hook in which I try to log the learning rate.
Of course, if the scheduler were accessible as a class member, we could call self.scheduler.get_lr() on it and plot the value. Since this is not how it has been implemented, I am wondering how to do this.
Would appreciate any pointers.
PyTorch Lightning - 0.8.5
Hi! Thanks for your contribution, great first issue!
If you have the same lr throughout the network (single param group) you can get it from:
self.trainer.optimizers[0].param_groups[0]['lr']
Change the indexing based on your optimizer and param group configuration.
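For reference, a minimal callback sketch along those lines (hedged: the on_batch_end(trainer, pl_module) hook signature follows the 0.8.x Callback API, and the class and metric names here are just illustrative choices):

from pytorch_lightning.callbacks import Callback

class LRLoggerCallback(Callback):
    def on_batch_end(self, trainer, pl_module):
        # Read the current learning rate from the first optimizer's first param group
        lr = trainer.optimizers[0].param_groups[0]['lr']
        # Send it to whatever logger is attached to the Trainer (e.g. TensorBoard)
        pl_module.logger.log_metrics({'lr': lr}, step=trainer.global_step)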
That worked! Even if I have multiple groups, does it work the same if I do something like this?
{f'lr_group{i}': param['lr'] for i, param in enumerate(self.trainer.optimizers[0].param_groups)}
should work!
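Plugged into the callback sketch above, that dict can be logged directly (same assumptions; the lr_group{i} metric names are just illustrative):

    def on_batch_end(self, trainer, pl_module):
        # One metric per param group: lr_group0, lr_group1, ...
        lrs = {f'lr_group{i}': group['lr']
               for i, group in enumerate(trainer.optimizers[0].param_groups)}
        pl_module.logger.log_metrics(lrs, step=trainer.global_step)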
There is a LearningRateLogger callback in lightning.
@SkafteNicki mind having a look?
As @rohitgr7 mentioned, the LearningRateLogger callback, which can be imported with from pytorch_lightning.callbacks import LearningRateLogger, should be able to do what you are asking for.
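For completeness, a minimal usage sketch (hedged: this assumes the 0.8.x API, where the callback picks up the scheduler(s) returned by configure_optimizers and logs their learning rates through the Trainer's logger):

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateLogger

# Attach the built-in callback; it logs the learning rate of each scheduler
# returned by configure_optimizers to the logger attached to the Trainer.
trainer = Trainer(callbacks=[LearningRateLogger()])
trainer.fit(model)  # model is your LightningModule (hypothetical name here)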