Hi all,
I'm wondering whether there is an easy way to access the internal state of the LSTM, i.e. the cell state c, for all the time steps of each input sequence. I can get h by setting return_sequences=True, and I know that h = o * self.activation(c), but I cannot figure out an easy way to access c.
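For reference, a minimal sketch of what I'm doing now (the layer size and input shape are just placeholders):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM

model = Sequential()
# return_sequences=True gives h for every timestep of each input sequence,
# but the cell state c is never exposed.
model.add(LSTM(32, input_shape=(10, 8), return_sequences=True))
h_all = model.predict(np.random.rand(4, 10, 8))  # shape (4, 10, 32): only h, never c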
Thanks!
You could probably access self.states from an LSTM layer. Check the source for inspiration: https://github.com/fchollet/keras/blob/master/keras/layers/recurrent.py
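Something along these lines might work as a rough, untested sketch (shapes are placeholders; if I remember right, a stateful LSTM's states is [h, c], holding the values after the last processed batch rather than per timestep):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM
from keras import backend as K

model = Sequential()
model.add(LSTM(32, batch_input_shape=(4, 10, 8), stateful=True, return_sequences=True))
lstm = model.layers[0]

x_batch = np.random.rand(4, 10, 8)
h_all = model.predict(x_batch)                          # h at every timestep
h_last, c_last = [K.get_value(s) for s in lstm.states]  # final h and c after this batch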
But you have to use stateful=True, which will change the training scheme; otherwise self.states would be None, right?
Sorry, I misspoke. I sometimes hit comment too fast =).
_It is hard to get access to all of the timesteps while the timesteps are still being produced._
If you _just_ want them passed out the same way h already is, then the previous answer is correct. Never mind me...
To answer your question, I don't think the training changes much if you have stateful on. You should reset the states every batch; that's about it.
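For instance, a toy version of that loop might look like this (untested; the sizes, optimizer, and loss are arbitrary):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(16, batch_input_shape=(4, 10, 8), stateful=True))
model.add(Dense(1))
model.compile(optimizer='rmsprop', loss='mse')

X = np.random.rand(8, 10, 8)
y = np.random.rand(8, 1)

for epoch in range(2):
    for i in range(0, len(X), 4):
        model.train_on_batch(X[i:i + 4], y[i:i + 4])
        model.reset_states()  # wipe h and c before the next batch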
It could even be as simple as...
from keras.layers import LSTM

class modLSTM(LSTM):
    def call(self, x, mask=None):
        # Wipe h and c before processing the input, so states never carry over.
        if self.stateful:
            self.reset_states()
        return super(modLSTM, self).call(x, mask)
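If you went that route, you'd presumably just drop it in wherever a stock LSTM goes, e.g. (placeholder shapes, untested):

from keras.models import Sequential

model = Sequential()
model.add(modLSTM(32, batch_input_shape=(4, 10, 8), stateful=True, return_sequences=True))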