Pytorch-lightning: Plotting learning rate from a lr_scheduler via a Callback

Created on 22 Jul 2020 · 7 comments · Source: PyTorchLightning/pytorch-lightning

I think the title explains a lot, but let me elaborate: I have a LightningModule whose configure_optimizers method returns an optimizer and a scheduler. Later, in a Callback, I have an on_batch_end hook in which I try to log the learning rate.

Of course, if the scheduler were accessible as a class member, we could call self.scheduler.get_lr() on it and use the value to plot. Since this is not how it has been implemented, I am wondering how to do this.

Would appreciate any pointers.
PyTorch Lightning version: 0.8.5

Labels: enhancement · good first issue · question

All 7 comments

Hi! Thanks for your contribution, great first issue!

If you have the same lr throughout the network (a single param group), you can get it from:
self.trainer.optimizers[0].param_groups[0]['lr']
Change the indexing based on your optimizer and param-group configuration.
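For reference, a minimal sketch of such a callback (LRLoggerCallback is a hypothetical name; this assumes the 0.8.x hook signature on_batch_end(self, trainer, pl_module) and that a logger is attached to the trainer):

```python
from pytorch_lightning.callbacks import Callback

class LRLoggerCallback(Callback):
    """Log the current learning rate at the end of every batch."""

    def on_batch_end(self, trainer, pl_module):
        # Single optimizer with a single param group; adjust the
        # indices if your setup differs.
        lr = trainer.optimizers[0].param_groups[0]['lr']
        trainer.logger.log_metrics({'lr': lr}, step=trainer.global_step)
```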

That worked! Even if I have multiple param groups, does it work the same if I do something like this?

{f'lr_group{i}': param['lr'] for i, param in enumerate(self.trainer.optimizers[0].param_groups)}

should work!
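With the closing parenthesis added, that comprehension drops straight into a callback; a sketch (note that inside a Callback the trainer is passed in, so it is trainer.optimizers rather than self.trainer.optimizers):

```python
from pytorch_lightning.callbacks import Callback

class MultiGroupLRLogger(Callback):
    """Log the learning rate of every param group after each batch."""

    def on_batch_end(self, trainer, pl_module):
        # One metric per param group, e.g. {'lr_group0': 0.1, 'lr_group1': 0.01}.
        lrs = {f'lr_group{i}': group['lr']
               for i, group in enumerate(trainer.optimizers[0].param_groups)}
        trainer.logger.log_metrics(lrs, step=trainer.global_step)
```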

There is a LearningRateLogger callback in Lightning.

@SkafteNicki mind having a look?

As @rohitgr7 mentioned, the LearningRateLogger callback, which can be imported via from pytorch_lightning.callbacks import LearningRateLogger, should be able to do what you ask for.
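Usage is roughly as follows (assuming the 0.8.x API; in later releases this callback was renamed LearningRateMonitor). The callback requires that configure_optimizers actually returns a scheduler:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateLogger

# The callback picks up the scheduler(s) returned by configure_optimizers
# and logs their learning rates to the attached logger automatically.
lr_logger = LearningRateLogger()
trainer = Trainer(callbacks=[lr_logger])
```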
