Lightgbm: Does LGB support dynamic learning rate?

Created on 9 Nov 2020  ·  2 Comments  ·  Source: microsoft/LightGBM

Does LGB support dynamic learning rate?
Can the learning rate be made smaller and smaller over the course of training?

Also, I just updated LightGBM from 2.3.1 to 3.0.0 — is the old param "max_position" equivalent to "lambdarank_truncation_level"?

question

Most helpful comment

Does LGB support dynamic learning rate?

Yes, it does.

learning_rates (list, callable or None, optional (default=None)) – List of learning rates for each boosting round or a customized function that calculates learning_rate in terms of current number of round (e.g. yields learning rate decay).
https://lightgbm.readthedocs.io/en/latest/pythonapi/lightgbm.train.html#lightgbm.train

does the old param "max_position" equal to "lambdarank_truncation_level"?

Yes, formerly this param was named max_position. Please refer to https://github.com/microsoft/LightGBM/pull/2801/files.
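So migrating a lambdarank config from 2.3.1 to 3.0.0 is a straight rename; a minimal sketch with an illustrative truncation level of 10:

```python
# LightGBM 2.3.1 — old parameter name:
old_params = {"objective": "lambdarank", "max_position": 10}

# LightGBM 3.0.0+ — renamed per the PR linked above; same meaning, same value:
new_params = {"objective": "lambdarank", "lambdarank_truncation_level": 10}
```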

All 2 comments


I recently ran into this problem as well — thanks.
