Keras: Bidirectional LSTM gives error when input shape is given

Created on 31 Aug 2016 · 1 Comment · Source: keras-team/keras

This model:

from keras.models import Sequential
from keras.layers import Dense, Activation, LSTM, Bidirectional

model = Sequential()
model.add(Bidirectional(LSTM(128, input_shape=(1, 32))))  # input_shape given to the inner LSTM
model.add(Dense(2))
model.add(Activation('softmax'))

raises the error:

Exception: The first layer in a Sequential model must get an `input_shape` or `batch_input_shape` argument.

Most helpful comment

You should pass the input shape to the Bidirectional layer (i.e. your first
layer), not the LSTM inside.
