Keras: Change lr without recompiling the model

Created on 10 Nov 2017 · 2 comments · Source: keras-team/keras

Can I change the learning rate without calling model.compile() again? If I compile the model several times, I run out of memory, and if I use clear_session() I lose the weights.

Is there a way to change the lr without recompiling?

All 2 comments

https://keras.io/callbacks/
This should contain what you want.
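
For instance, here is a minimal sketch of a schedule with the built-in LearningRateScheduler callback; the toy model, data, and halving schedule are all placeholders, not anything prescribed by the docs:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import LearningRateScheduler

# Toy model and data, only so the sketch runs end to end.
model = Sequential([Dense(1, input_dim=4)])
model.compile(optimizer='sgd', loss='mse')
x = np.random.rand(32, 4)
y = np.random.rand(32, 1)

# Illustrative schedule: halve the learning rate every 10 epochs.
def schedule(epoch):
    return 0.01 * (0.5 ** (epoch // 10))

# The callback sets the optimizer's lr at the start of each epoch,
# so the model is compiled once and never recompiled.
model.fit(x, y, epochs=30, callbacks=[LearningRateScheduler(schedule)])
```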

If I use clear_session(), I lose the weights.

You could use get_weights / set_weights to save the weights before clearing the session and restore them afterwards, as in the sketch below.
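
A rough sketch of that round trip, where build_model() is a hypothetical helper standing in for your own function that rebuilds (and recompiles) the same architecture:

```python
from keras import backend as K

# Snapshot the weights as numpy arrays before clearing the session.
weights = model.get_weights()
K.clear_session()

# build_model() is a hypothetical stand-in for your own function that
# reconstructs the same architecture; the snapshot is loaded back into it.
model = build_model()
model.set_weights(weights)
```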

Note that you can change the learning rate at any time via K.set_value(model.optimizer.lr, value); this updates the optimizer's backend variable in place, so no recompilation is needed (see the implementation of the LearningRateScheduler callback for an example).
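
A quick sketch of that call, reusing the model compiled in the callback example above (the new value of 0.001 is arbitrary):

```python
from keras import backend as K

# Read the current learning rate, then overwrite it in place;
# nothing is recompiled.
print(K.get_value(model.optimizer.lr))
K.set_value(model.optimizer.lr, 0.001)
```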
