Pytorch-lightning: stepwise learning rate scheduler

Created on 1 Nov 2020 · 3 comments · Source: PyTorchLightning/pytorch-lightning

Hello,

I am trying to manage LR scheduling during my training with PyTorch Lightning. The main methods rely on epoch-wise LR updates; is there a way to switch this to step-wise updates?

  • OS: [Linux]
  • Packaging: [pip]
  • Version: [1.0.4]
question

All 3 comments

Hi! Thanks for your contribution, great first issue!

https://pytorch-lightning.readthedocs.io/en/latest/optimizers.html?highlight=scheduller
for example:

from torch.optim import Adam
from torch.optim.lr_scheduler import OneCycleLR

def configure_optimizers(self):
    optimizer = Adam(...)
    # 'interval': 'step' makes Lightning call scheduler.step() after every optimizer step
    scheduler = {'scheduler': OneCycleLR(optimizer, ...),
                 'interval': 'step'}
    return [optimizer], [scheduler]
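
For context, here is a minimal self-contained sketch of a LightningModule using this pattern; the model, max_lr, and total_steps values are illustrative placeholders, not from the original answer:

import torch
from torch import nn
from torch.optim import Adam
from torch.optim.lr_scheduler import OneCycleLR
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        optimizer = Adam(self.parameters(), lr=1e-3)
        scheduler = {
            # placeholder hyperparameters; set total_steps to your actual number of training steps
            'scheduler': OneCycleLR(optimizer, max_lr=1e-2, total_steps=1000),
            # step the scheduler after every optimizer step instead of every epoch
            'interval': 'step',
            'frequency': 1,
        }
        return [optimizer], [scheduler]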

I'll close this for now, feel free to re-open if you have more questions.

Also check out our forums: https://forums.pytorchlightning.ai/

