Keras: Error when cloning a model with custom activation function

Created on 24 Dec 2017 · 9 comments · Source: keras-team/keras

Hello,

While I was trying to clone a model I faced the following error:

Traceback (most recent call last):

File "", line 10, in
model_RMSE = clone_model(model)

File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 1523, in clone_model
if layer not in layer_map:

File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 1378, in _clone_functional_model
merge_config = self.layers[0].get_config()

File "/usr/local/lib/python3.5/dist-packages/keras/engine/topology.py", line 1252, in from_config
# Returns

File "/usr/local/lib/python3.5/dist-packages/keras/legacy/interfaces.py", line 87, in wrapper
return func(*args, **kwargs)

File "/usr/local/lib/python3.5/dist-packages/keras/layers/core.py", line 808, in __init__
super(Dense, self).__init__(**kwargs)

File "/usr/local/lib/python3.5/dist-packages/keras/activations.py", line 95, in get
if identifier is None:

File "/usr/local/lib/python3.5/dist-packages/keras/activations.py", line 87, in deserialize
def deserialize(name, custom_objects=None):

File "/usr/local/lib/python3.5/dist-packages/keras/utils/generic_utils.py", line 159, in deserialize_keras_object
if fn is None:

ValueError: Unknown activation function:exp
Here is a minimal reproducible example:

from keras.models import Model, clone_model
from keras.layers import Input, Dense
from keras.optimizers import Adam
from keras import backend as K

inputs = Input(name='input1', shape=(1,))
x = Dense(15, activation='relu')(inputs)
outputs = Dense(1, activation=K.exp)(x)
model = Model(inputs=inputs, outputs=outputs)
model.compile(loss='poisson', optimizer=Adam(lr=5e-3))
model_clone = clone_model(model)

When I replace

outputs = Dense(1, activation=K.exp)(x)

with

outputs = Dense(1, activation=None)(x)

the error disappears.
Any idea how to solve this problem?

Most helpful comment

I still have this bug

x = keras.layers.Activation(my_custom_activation_function)(x) does not work; I have to use x = keras.layers.Lambda(my_custom_activation_function)(x) instead.

Why this strange behaviour?

All 9 comments

Hi! It does indeed look like a bug in Keras. As a temporary fix, you could use a Lambda layer instead of the activation:
Lambda(lambda x: K.exp(x), output_shape=lambda s: s)
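For reference, a minimal sketch of that workaround applied to the example from the first post (the exponential moves out of the Dense layer and into a Lambda, so clone_model no longer has to look up an activation by name):

from keras.models import Model, clone_model
from keras.layers import Input, Dense, Lambda
from keras.optimizers import Adam
from keras import backend as K

inputs = Input(name='input1', shape=(1,))
x = Dense(15, activation='relu')(inputs)
x = Dense(1, activation=None)(x)           # plain linear output layer
outputs = Lambda(lambda t: K.exp(t))(x)    # exponential applied via Lambda
model = Model(inputs=inputs, outputs=outputs)
model.compile(loss='poisson', optimizer=Adam(lr=5e-3))
model_clone = clone_model(model)           # no "Unknown activation function" error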

This is a little more pressing now, as clone_model is used in multi_gpu_model when cpu_relocation is True.

Same issue when trying to load a saved model with exponential activation.

_Unknown activation function:exp_
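For the saved-model case specifically, load_model does take a custom_objects mapping, so something like the following sketch should let the file load (assuming the model was saved to 'model.h5', which is a placeholder path; the key has to match the name in the error message):

from keras.models import load_model
from keras import backend as K

# 'model.h5' is a placeholder path for the saved model.
model = load_model('model.h5', custom_objects={'exp': K.exp})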

I still have this bug

x = keras.layers.Activation(my_custom_activation_function)(x) does not work; I have to use x = keras.layers.Lambda(my_custom_activation_function)(x) instead.

Why this strange behaviour?
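One hedged explanation for the Activation failure: Activation serializes a custom function by its name only, and rebuilding the layer (which is what clone_model does) has to look that name up again. Registering the name in Keras's global custom objects makes that lookup succeed; a small sketch, assuming a toy Sequential model:

from keras import backend as K
from keras.models import Sequential, clone_model
from keras.layers import Dense, Activation
from keras.utils import get_custom_objects

def my_custom_activation_function(x):
    return K.exp(x)

model = Sequential([Dense(4, input_shape=(3,)),
                    Activation(my_custom_activation_function)])

# Without registration, clone_model(model) is expected to raise
# ValueError: Unknown activation function:my_custom_activation_function
get_custom_objects().update(
    {'my_custom_activation_function': my_custom_activation_function})
model_clone = clone_model(model)  # the name now resolves during deserialization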

Would it be possible to add a `custom_objects` argument as an input, like in `load_model`?
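Until clone_model grows such an argument, one possible stand-in (a sketch, not part of the clone_model API, reusing the model from the first post) is to rebuild the model from its config, since Model.from_config already accepts custom_objects:

from keras.models import Model
from keras import backend as K

config = model.get_config()
# The key must match the name in the error message ('exp' for K.exp).
model_copy = Model.from_config(config, custom_objects={'exp': K.exp})
model_copy.set_weights(model.get_weights())  # optional; clone_model also re-initializes weights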

Is there any update on this bug yet? @gabrieldemarmiesse

Here's a quick workaround:

If your model has a custom activation like this:

Dense(1, activation=K.exp)

then do the following when loading the model for prediction:

from keras import backend as K
def exponential(x):
    """Exponential (base e) activation function.
    # Arguments
        x: Input tensor.
    # Returns
        Exponential activation: `exp(x)`.
    """
    return K.exp(x)


from keras.models import model_from_json
from keras.layers import Activation

model = model_from_json(open('/model_architecture.json').read(),
                        custom_objects={'exp': Activation(exponential)})

Alternatively, register the function globally so that every deserialization call can find it:

from keras.utils import get_custom_objects
get_custom_objects().update({'some.module.func_name': func_name})
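Presumably the same global registration also unblocks the original clone_model call, since the failing lookup goes through the same custom-object table (a sketch, reusing the exponential helper above and the model from the first post):

from keras.models import clone_model
from keras.utils import get_custom_objects

get_custom_objects().update({'exp': exponential})  # 'exp' is the name from the error message
model_clone = clone_model(model)                   # deserialization can now resolve 'exp'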

Another example:

import keras
from keras.utils import get_custom_objects

def mish(x):
    return x * keras.backend.tanh(keras.backend.softplus(x))

get_custom_objects().update({'mish': mish})
...
model.add(Dense(.., activation=mish, ..))
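For completeness, a runnable sketch of that pattern (the layer sizes and the toy Sequential model are placeholders, not from the original comment):

import keras
from keras.models import Sequential, clone_model
from keras.layers import Dense
from keras.utils import get_custom_objects

def mish(x):
    return x * keras.backend.tanh(keras.backend.softplus(x))

get_custom_objects().update({'mish': mish})

model = Sequential()
model.add(Dense(32, activation=mish, input_shape=(10,)))
model.add(Dense(1))
model_clone = clone_model(model)  # works because 'mish' is registered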