While using this model
model = Sequential()
model.add(Bidirectional(LSTM(128, input_shape=(1,32))))
model.add(Dense(2))
model.add(Activation('softmax'))
gives the error
Exception: The first layer in a Sequential model must get an `input_shape` or `batch_input_shape` argument.
You should pass the input shape to the Bidirectional layer (i.e. your first
layer), not the LSTM inside.
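For example, a minimal sketch of the fix (moving `input_shape` from the inner LSTM to the `Bidirectional` wrapper; the compile settings here are illustrative, not from the original thread):

```python
from keras.models import Sequential
from keras.layers import Dense, Activation, LSTM, Bidirectional

model = Sequential()
# input_shape goes on the Bidirectional wrapper (the first layer),
# not on the LSTM it wraps
model.add(Bidirectional(LSTM(128), input_shape=(1, 32)))
model.add(Dense(2))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
```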