Keras: LeakyReLU activation does not work with newest release of Keras

Created on 12 Apr 2016 · 5 comments · Source: keras-team/keras

Hey all,

I upgraded from Keras version 0.3.2 to 0.3.3 and now my code no longer works.
The line that makes it break is:

model.add(Convolution2D(params['k']*get_FeatureMaps(1, params['fp']), 2, 2, init='orthogonal', activation=LeakyReLU(params['a']), input_shape=input_shape[1:]))

whereas when I add the activation as a separate layer, things are OK, except that then I can't use LeakyReLU:

model.add(Convolution2D(params['k']*get_FeatureMaps(1, params['fp']), 2, 2, init='orthogonal', input_shape=input_shape[1:]))
model.add(Activation('relu'))

Any hint on how to code a LeakyReLU(0.3)?

many thanks
Peter

All 5 comments

Can confirm this.

Thanks for the fast reply.
Any idea if and when this will be addressed?

You can use LeakyReLU as a layer, which is how it is actually meant to be used. Do not pass a layer instance as the activation argument; that argument is supposed to take _functions_. But you could also implement your own LeakyReLU _activation function_ and pass it as the activation, as sketched below.

Advanced activation _layers_ are not activation _functions_.
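
For example, a minimal sketch of such a custom activation function, assuming the backend module exposes maximum (the name leaky_relu and the example filter count are illustrative, not part of Keras):

from keras import backend as K

def leaky_relu(x, alpha=0.3):
    # max(alpha * x, x) equals LeakyReLU(alpha) for 0 < alpha < 1
    return K.maximum(alpha * x, x)

# pass the function itself (not a layer instance) as the activation
model.add(Convolution2D(32, 2, 2, init='orthogonal', activation=leaky_relu))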

PS: in case it wasn't clear, this is what you want to do:

model.add(LeakyReLU(params['a']))
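
In the context of the original snippet, that pattern looks like this (a sketch; the import path below is the advanced-activations module as named in the Keras docs of that era):

from keras.layers.advanced_activations import LeakyReLU

# convolution layer without a built-in activation ...
model.add(Convolution2D(params['k']*get_FeatureMaps(1, params['fp']), 2, 2, init='orthogonal', input_shape=input_shape[1:]))
# ... followed by LeakyReLU as its own layer
model.add(LeakyReLU(params['a']))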

Thanks, that is clear!
