Assuming that PR https://github.com/PyTorchLightning/pytorch-lightning/pull/941 gets merged, this should be as simple as adding this property to the trainer:
@property
def lr(self):
    ''' Returns current learning rate for schedulers '''
    if not self.schedulers:
        raise ValueError('No learning rate schedulers initialized')
    return [s['scheduler'].get_lr() for s in self.schedulers]
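With that in place, the current rates could be read as `trainer.lr`, which would return one list of learning rates per configured scheduler.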
@SkafteNicki Sure, it's simple; I just wanted to discuss it thoroughly in an issue so that my PR won't be closed again :smile:
(Posting it in the Slack channel didn't get any feedback.)
Here's my implementation (in my case, calling get_lr on the scheduler changes the learning rate, so I read ['lr'] from the optimizer instead):
def get_lr(self):
    # read the lr directly from the optimizer's param groups, since calling
    # the scheduler's get_lr() changes the learning rate in this setup
    for param_group in self.trainer.optimizers[0].param_groups:
        return param_group['lr']
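For reference, here is a minimal standalone sketch of why reading from the optimizer's param groups is safe: it only inspects optimizer state and never calls into the scheduler (the model, optimizer, and scheduler below are just placeholders):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

# reading the lr out of the param groups does not call into the scheduler,
# so it cannot accidentally advance or recompute the schedule
current_lr = optimizer.param_groups[0]['lr']
print(current_lr)  # 0.01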
Sorry, I forgot that PyTorch has updated its interface. Instead of get_lr(), the lr can be extracted with get_last_lr() (without changing the actual lr). So a revised version of my initial proposal would be:
@property
def get_lr(self):
    ''' Returns current learning rate for schedulers '''
    if not self.schedulers:
        raise ValueError('No learning rate schedulers initialized')
    return [s['scheduler'].get_last_lr() for s in self.schedulers]
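As a quick sanity check of that API difference, here is a small standalone PyTorch sketch (the model, optimizer, and StepLR settings are arbitrary): get_last_lr() only reports the value set by the most recent scheduler step, without recomputing or mutating anything.

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

print(scheduler.get_last_lr())  # [0.1] -- value from the initial step
optimizer.step()
scheduler.step()
print(scheduler.get_last_lr())  # [0.05] -- reflects the decay, read-only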
I'm not sure this is general enough. Why wouldn't you just add a hook to get the lrs yourself?
Fine. I've added it myself. Just wondering if you'd like to add it to the trainer.
I would say use the callback system for that :)
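For completeness, a minimal sketch of what that could look like with the callback system; the class name, the hook used, and printing instead of proper logging are illustrative choices, and it assumes the callback hooks receive the trainer so its optimizers can be inspected:

from pytorch_lightning.callbacks import Callback

class LrLoggerCallback(Callback):
    ''' Hypothetical callback that reports the current learning rate(s). '''

    def on_epoch_end(self, trainer, pl_module):
        # collect the lr of every param group of every optimizer
        lrs = [group['lr']
               for optimizer in trainer.optimizers
               for group in optimizer.param_groups]
        print(f'epoch {trainer.current_epoch}: learning rates {lrs}')

# usage: pass it to the Trainer like any other callback
# trainer = Trainer(callbacks=[LrLoggerCallback()])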