Keras: using keras.layers.CuDNNLSTM: Bidirectional Wrapper Error

Created on 17 Oct 2017 · 6 comments · Source: keras-team/keras

I know this layer was only just added and still has limited features. The CuDNNLSTM layer itself works absolutely fine, but does it work with the Bidirectional wrapper?

OS: Windows 10
Conda env
Keras version: master (2.0.8+)
TensorFlow backend version: tf_nightly_gpu-1.5.0

GPU: GeForce GTX 1060 (6 GB)
CUDA version: v8.0.61
cuDNN version: cudnn-8.0-windows10-x64-v5.1

from keras.layers import Input, Bidirectional, CuDNNLSTM, Dropout, TimeDistributed, Dense
from keras.models import Model

def return_model(layers, dropouts):
    inputs = Input(shape=(layers[0], 1), name='main_input')
    lstm = Bidirectional(CuDNNLSTM(layers[1]))(inputs)
    lstm = Dropout(dropouts[0])(lstm)
    output = TimeDistributed(Dense(1, activation='linear'))(lstm)

    model = Model(inputs=[inputs], outputs=[output])

    model.compile(loss="mse", optimizer="adam")
    print(model.summary())
    return model

if __name__=='__main__':
    model = return_model([48,256],[0.3])

I am getting the following error:

def return_model(layers, dropouts):
        inputs = Input(shape=(layers[0], 1), name='main_input')
--->    lstm = Bidirectional(CuDNNLSTM(layers[1]))(inputs)
        lstm = Dropout(dropouts[0])(lstm)
        output = TimeDistributed(Dense(1, activation='linear'))(lstm)

~\AppData\Local\conda\conda\envs\keras-tf-gpu-dev\lib\site-packages\keras-2.0.8-py3.5.egg\keras\layers\wrappers.py in __init__(self, layer, merge_mode, weights, **kwargs)
             self.forward_layer = copy.copy(layer)
             config = layer.get_config()
-->          config['go_backwards'] = not config['go_backwards']
             self.backward_layer = layer.__class__.from_config(config)
             self.forward_layer.name = 'forward_' + self.forward_layer.name

KeyError: 'go_backwards'

Most helpful comment

This is fixed in 2.1.0: you can natively use the Bidirectional wrapper on CuDNNRNNs now. Please close the issue.

All 6 comments

Adding a go_backwards argument to CuDNNRNN and CuDNNLSTM, plus commenting out the part of the wrapper that determines the training phase, fixes this for now (note that this will break dropout on the normal LSTM layer!). It is only a temporary workaround until we get dropout on the CuDNNRNN layers, though. A sketch of the same idea follows below.
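For illustration, here is a minimal sketch of that idea done as a subclass instead of editing the installed Keras source, assuming Keras 2.0.8 with the TensorFlow backend. The PatchedCuDNNLSTM name and the dropout attribute shims are hypothetical and untested against every 2.0.x build; treat it as a sketch, not a drop-in fix.

from keras import backend as K
from keras.layers import CuDNNLSTM

class PatchedCuDNNLSTM(CuDNNLSTM):
    # Hypothetical workaround layer: carries the config key and attributes
    # that the Bidirectional wrapper expects but CuDNNLSTM lacks in 2.0.8.

    def __init__(self, units, go_backwards=False, **kwargs):
        super(PatchedCuDNNLSTM, self).__init__(units, **kwargs)
        self.go_backwards = go_backwards
        # Harmless placeholders, in case the wrapper inspects dropout
        # when deciding how to handle the training phase.
        self.dropout = 0.
        self.recurrent_dropout = 0.

    def call(self, inputs, **kwargs):
        # The cuDNN kernels only run forward in time, so emulate
        # go_backwards by reversing the time axis before the forward pass.
        if self.go_backwards:
            inputs = K.reverse(inputs, 1)
        return super(PatchedCuDNNLSTM, self).call(inputs, **kwargs)

    def get_config(self):
        config = super(PatchedCuDNNLSTM, self).get_config()
        # This is the key that Bidirectional flips when it builds the
        # backward copy, and whose absence raises the KeyError above.
        config['go_backwards'] = self.go_backwards
        return config

With this, Bidirectional(PatchedCuDNNLSTM(256))(inputs) should get past the KeyError, but it remains a stop-gap until an official fix lands.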

@tRosenflanz Thank you!! So this fix works by modifying the Keras wrapper, right? Hope they bring out an official fix soon, because CuDNNLSTMs are too good :p.

Is it known when this fix will be implemented in the master branch?

Thank you

v2.0.9 has been released, but I think it's not fixed there.

Any expected timeline for this fix?

This is fixed in 2.1.0: you can natively use the Bidirectional wrapper on CuDNNRNNs now. Please close the issue.
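As a quick sanity check of the fixed behaviour, a minimal sketch assuming Keras >= 2.1.0 with the TensorFlow GPU backend; the shapes are the ones from the original snippet, and TimeDistributed is dropped since the wrapped layer returns a single vector per sequence here:

from keras.layers import Input, Bidirectional, CuDNNLSTM, Dense
from keras.models import Model

# Bidirectional wrapping a CuDNN layer, no custom patching needed in 2.1.0+.
inputs = Input(shape=(48, 1), name='main_input')
x = Bidirectional(CuDNNLSTM(256))(inputs)
output = Dense(1, activation='linear')(x)

model = Model(inputs=inputs, outputs=output)
model.compile(loss='mse', optimizer='adam')
model.summary()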
