Hello,
I am trying to manage learning-rate scheduling in my training with PyTorch Lightning. The main approaches rely on epoch-wise LR updates; is there a way to switch this to step-wise?
Hi! Thanks for your contribution, great first issue!
https://pytorch-lightning.readthedocs.io/en/latest/optimizers.html?highlight=scheduller
For example:

from torch.optim import Adam
from torch.optim.lr_scheduler import OneCycleLR

def configure_optimizers(self):
    optimizer = Adam(...)
    # 'interval': 'step' makes Lightning call scheduler.step() after every
    # training step instead of once per epoch
    scheduler = {'scheduler': OneCycleLR(optimizer, ...),
                 'interval': 'step'}
    return [optimizer], [scheduler]
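If it helps, here is a minimal self-contained sketch of that pattern inside a LightningModule; the model, the loss, and the OneCycleLR arguments (max_lr, total_steps) are assumed placeholder values for illustration, not recommendations:

import torch.nn.functional as F
import pytorch_lightning as pl
from torch import nn
from torch.optim import Adam
from torch.optim.lr_scheduler import OneCycleLR

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(28 * 28, 10)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return F.cross_entropy(self.layer(x.view(x.size(0), -1)), y)

    def configure_optimizers(self):
        optimizer = Adam(self.parameters(), lr=1e-3)
        scheduler = {
            # total_steps=1000 is an assumed value; set it to the real
            # number of optimizer steps in your run
            'scheduler': OneCycleLR(optimizer, max_lr=1e-2, total_steps=1000),
            'interval': 'step',  # update the LR every batch, not every epoch
        }
        return [optimizer], [scheduler]

With 'interval': 'step', Lightning steps the scheduler after every optimizer step, so OneCycleLR traces its full cycle across batches rather than epochs.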
I'll close this for now, feel free to re-open if you have more questions.
Also check out our forums: https://forums.pytorchlightning.ai/