Hey,
I think it would be a cool feature to add a flag enabling the logging of the learning rate(s).
Thanks for your amazing work!
Hi! Thanks for your contribution, great first issue!
I think that's a great idea. Maybe it doesn't even have to be a flag; it could be done by default, like the other metrics that are already plotted automatically.
Some things to consider:
For Adam it's a pickle, since the effective per-parameter step also depends on the moment estimates, so the plain LR value only tells part of the story. Maybe it would be better to log the scheduler information instead, i.e. its scaling of the initial learning rate. I guess that would solve the parameter-group problem as well.
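For reference, the base value that a scheduler scales can be read straight off the optimizer's parameter groups; a minimal sketch (the model/optimizer here are hypothetical, and `get_last_lr()` assumes a recent PyTorch):

```python
import torch

# hypothetical model and Adam optimizer, just for illustration
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

# one base LR per parameter group -- this is what a logging callback would record
group_lrs = [group["lr"] for group in optimizer.param_groups]

# the scheduler exposes the same (already scaled) values
scheduler_lrs = scheduler.get_last_lr()

# note: for Adam the effective per-parameter step also depends on the moment
# estimates, so this base LR is only part of the picture
```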
@PyTorchLightning/core-contributors do we want to add extra logging for the LR, or just stick with logging these extra parameters as metrics...?
I proposed adding a Trainer.lr in #1003, but we decided to use callbacks instead back then.
I'd also stick with callbacks.
The simplest approach to training a network doesn't even include LR changes, and it doesn't make sense to log something that doesn't change by design.
However, for convenience we could provide an implementation of such a callback.
I implemented a callback that logs the learning rate a while ago for some experiments I did. I can bring it up to date and create a PR if wanted.
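Something along these lines; a rough sketch rather than the final version (the `on_batch_start` hook and the `trainer.logger.log_metrics` call follow the current Callback/Trainer API, details may need adjusting in the PR):

```python
from pytorch_lightning.callbacks import Callback


class LearningRateLogger(Callback):
    """Log the current learning rate of every optimizer parameter group."""

    def on_batch_start(self, trainer, pl_module):
        logs = {}
        for opt_idx, optimizer in enumerate(trainer.optimizers):
            for group_idx, group in enumerate(optimizer.param_groups):
                logs[f"lr-optimizer{opt_idx}-group{group_idx}"] = group["lr"]
        # send the values to whatever logger the Trainer was configured with
        if trainer.logger is not None:
            trainer.logger.log_metrics(logs, step=trainer.global_step)
```

It would then just be passed to the Trainer as usual, e.g. `Trainer(callbacks=[LearningRateLogger()])`.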