Pytorch-lightning: Logging the learning rate

Created on 21 Mar 2020 · 7 comments · Source: PyTorchLightning/pytorch-lightning

Hey,

I think it would be a cool feature to add a flag enabling logging of the learning rate(s).

Thanks for your amazing work!

Labels: discussion, enhancement, help wanted

All 7 comments

Hi! Thanks for your contribution, great first issue!

I think that's a great idea. Maybe it doesn't have to be a flag, it could be done by default like the other metrics that are already plotted automatically.

Some things to consider:

  • How would it work for optimizers like Adam?
  • Optimizers may have different learning rates for different param groups (see the sketch below)
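For reference, the per-group learning rates are readable from the optimizer itself, so whatever logging mechanism gets chosen would need to emit one value per parameter group. A minimal sketch in plain PyTorch (the model and the group split are made up for illustration):

```python
import torch

model = torch.nn.Linear(10, 2)

# two parameter groups with different learning rates
optimizer = torch.optim.Adam(
    [
        {"params": [model.weight], "lr": 1e-3},
        {"params": [model.bias], "lr": 1e-4},
    ]
)

# the values a logger would have to record: one per param group
current_lrs = [group["lr"] for group in optimizer.param_groups]
print(current_lrs)  # [0.001, 0.0001]
```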

For Adam it's a pickle. Maybe log the scheduler information instead, i.e. its scaling of the initial learning rate. That would solve the per-group problem as well, I guess.
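If the scheduler is the thing being logged, the effective (post-scheduling) learning rate can be read from the scheduler itself. A small sketch using a standard `torch.optim.lr_scheduler` (the concrete scheduler here is just an example):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

# get_last_lr() returns the LR most recently applied to each param group,
# which is what a scheduler-aware logger would record each step or epoch
print(scheduler.get_last_lr())  # [0.001] before any scheduler.step() calls
```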

@PyTorchLightning/core-contributors do we want to add extra logging for LR or just stay with logging these extra parameters as a metric...?

I proposed adding a Trainer.lr in #1003, but we decided to use the callbacks then.

I'd also stick to callbacks.
The simplest approach to training a network doesn't even include LR changes, and it doesn't make sense to log something that by design never changes.

However, for convenience we could provide an implementation of such a callback.

I implemented a callback that logs the learning rate for some earlier experiments of mine. I can bring it up to date and create a PR if wanted.
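For the record, a callback along those lines could look roughly like the sketch below. This is not the callback mentioned above, just an illustration of the idea; the hook name, `trainer.optimizers`, and `trainer.logger.log_metrics` usage follow the Lightning API as I understand it, so treat them as assumptions.

```python
import pytorch_lightning as pl


class LRLoggingCallback(pl.Callback):
    """Hypothetical callback that logs the current learning rate(s)."""

    def on_epoch_start(self, trainer, pl_module):
        metrics = {}
        # one entry per optimizer and per param group, so multi-group
        # optimizers (e.g. different LRs for backbone and head) are covered
        for opt_idx, optimizer in enumerate(trainer.optimizers):
            for group_idx, group in enumerate(optimizer.param_groups):
                metrics[f"lr/optimizer{opt_idx}/group{group_idx}"] = group["lr"]
        if trainer.logger is not None:
            trainer.logger.log_metrics(metrics, step=trainer.global_step)
```

Usage would then just be passing it to the trainer, e.g. `Trainer(callbacks=[LRLoggingCallback()])`.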
