Keras: Storing hidden states of a recurrent layer

Created on 4 Apr 2016 · 5 comments · Source: keras-team/keras

Does save_weights() on a model also save the hidden cells and states of an LSTM layer? I need to use the inner cells of one LSTM as a prior for another LSTM. As far as I understand, we can store the weights of an LSTM and load them later, but what about the inner cell and hidden states?

stale

Most helpful comment

Nope, but you can get/set them manually:

import keras.backend as K

def get_states(model):
    # model.state_updates is a list of (state_variable, new_value) update
    # pairs; read the current value of each state variable.
    return [K.get_value(s) for s, _ in model.state_updates]

def set_states(model, states):
    # Write a previously saved value back into each state variable.
    for (d, _), s in zip(model.state_updates, states):
        K.set_value(d, s)

All 5 comments

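To make concrete what those get/set helpers save and restore, here is a framework-free sketch with a hand-rolled SimpleRNN cell (the weights and sizes are made up for illustration): resuming from a saved hidden state reproduces the continuation exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hand-rolled SimpleRNN parameters (arbitrary, untrained values).
W_x = rng.standard_normal((1, 8)) * 0.1   # input -> hidden
W_h = rng.standard_normal((8, 8)) * 0.1   # hidden -> hidden
b = np.zeros(8)

def rnn_steps(x_seq, h):
    """Run the cell over x_seq, starting from hidden state h."""
    for x_t in x_seq:
        h = np.tanh(x_t @ W_x + h @ W_h + b)
    return h

x = rng.standard_normal((5, 1))

h = rnn_steps(x, np.zeros(8))     # state after the first batch
saved = h.copy()                  # get_states: snapshot the hidden state

h = rnn_steps(x, h)               # the live state keeps evolving...
h_restored = rnn_steps(x, saved)  # ...set_states resumes from the snapshot

# Resuming from the snapshot reproduces the continuation exactly.
assert np.allclose(h, h_restored)
```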

@NasenSpray Is the model in your code a Sequential object? I found my model.state_updates to be an empty list. However, I can access the states using model.layers[0].states.

my model object is:

from keras.models import Sequential
from keras.layers import SimpleRNN, TimeDistributed, Dense

batch_size = 32
seq_steps = 128

model = Sequential()
model.add(
    SimpleRNN(32,
              return_sequences=True,
              batch_input_shape=(batch_size, seq_steps, 1),
              stateful=True,
              )
)
model.add(TimeDistributed(Dense(1)))
model.compile(loss='mse', optimizer='rmsprop')
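For what it's worth, here is a sketch of the layer.states approach on a TF-backed Keras (tf.keras; the batch size, layer width, and use of the functional API are my own choices, not from the thread). States of a stateful RNN layer are variables, so they can be read with np.array() and written back with .assign():

```python
import numpy as np
from tensorflow import keras

batch_size, seq_steps = 4, 8

# Same shape of model as above, built with the functional API.
inputs = keras.Input(batch_shape=(batch_size, seq_steps, 1))
h = keras.layers.SimpleRNN(16, return_sequences=True, stateful=True)(inputs)
outputs = keras.layers.TimeDistributed(keras.layers.Dense(1))(h)
model = keras.Model(inputs, outputs)

rnn_layer = model.layers[1]           # the stateful SimpleRNN

x = np.random.rand(batch_size, seq_steps, 1).astype("float32")
model.predict(x, verbose=0)           # advances the internal state

saved = [np.array(s) for s in rnn_layer.states]   # snapshot the state

y1 = model.predict(x, verbose=0)      # continues from the snapshot...
for var, val in zip(rnn_layer.states, saved):
    var.assign(val)                   # ...restore it...
y2 = model.predict(x, verbose=0)      # ...and get the same output again

assert np.allclose(y1, y2, atol=1e-5)
```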

@tdihp: yes, but the code is also from before Keras' big API overhaul, so I'm not sure whether it works with the current version.

@tdihp @NasenSpray - has anyone tried this on a recent version of Keras? I'd like to implement something similar now, so I'd be interested to know whether the code above still works.
