Keras: model_from_json() in keras.models does not work for custom layers

Created on 21 Apr 2016 · 15 comments · Source: keras-team/keras

I have a custom layer which also has a custom regularizer of its own. After upgrading to the latest version of Keras (1.0.1) yesterday (Apr 19, 2016) I cannot load my models from json strings anymore. This holds true even for new models that I am creating after upgrading my Keras version.
Here is roughly what I am doing:
json_string = model.to_json()
json_string
This gives me the following json string:

'{"class_name": "Sequential", "config": [{"class_name": "SoftSwitch", "config": {"name": "SoftSwitch", "output_dim": 784, "trainable": true, "init": "one", "input_dtype": "float32", "input_dim": 784, "zRegularizer": {"mu": 0.001, "kappa": 0.01, "name": "SoftSwitchRegularizer"}, "batch_input_shape": [null, 784]}}, {"class_name": "Dense", "config": {"W_constraint": null, "b_constraint": null, "name": "dense_1", "activity_regularizer": null, "trainable": true, "init": "glorot_uniform", "input_dim": null, "b_regularizer": null, "W_regularizer": {"l2": 0.0010000000474974513, "name": "WeightRegularizer", "l1": 0.0}, "activation": "linear", "output_dim": 600}}, {"class_name": "Activation", "config": {"activation": "tanh", "trainable": true, "name": "activation_1"}}, {"class_name": "SoftSwitch", "config": {"name": "SoftSwitch", "output_dim": 600, "trainable": true, "init": "one", "input_dtype": "float32", "input_dim": 600, "zRegularizer": {"mu": 0.001, "kappa": 0.01, "name": "SoftSwitchRegularizer"}, "batch_input_shape": [null, 600]}}, {"class_name": "Dense", "config": {"W_constraint": null, "b_constraint": null, "name": "dense_2", "activity_regularizer": null, "trainable": true, "init": "glorot_uniform", "input_dim": null, "b_regularizer": null, "W_regularizer": {"l2": 0.0010000000474974513, "name": "WeightRegularizer", "l1": 0.0}, "activation": "linear", "output_dim": 600}}, {"class_name": "Activation", "config": {"activation": "tanh", "trainable": true, "name": "activation_2"}}, {"class_name": "SoftSwitch", "config": {"name": "SoftSwitch", "output_dim": 600, "trainable": true, "init": "one", "input_dtype": "float32", "input_dim": 600, "zRegularizer": {"mu": 0.001, "kappa": 0.01, "name": "SoftSwitchRegularizer"}, "batch_input_shape": [null, 600]}}, {"class_name": "Dense", "config": {"W_constraint": null, "b_constraint": null, "name": "dense_3", "activity_regularizer": null, "trainable": true, "init": "glorot_uniform", "input_dim": null, "b_regularizer": null, 
"W_regularizer": {"l2": 0.0010000000474974513, "name": "WeightRegularizer", "l1": 0.0}, "activation": "linear", "output_dim": 600}}, {"class_name": "Activation", "config": {"activation": "tanh", "trainable": true, "name": "activation_3"}}, {"class_name": "SoftSwitch", "config": {"name": "SoftSwitch", "output_dim": 600, "trainable": true, "init": "one", "input_dtype": "float32", "input_dim": 600, "zRegularizer": {"mu": 0.001, "kappa": 0.01, "name": "SoftSwitchRegularizer"}, "batch_input_shape": [null, 600]}}, {"class_name": "Dense", "config": {"W_constraint": null, "b_constraint": null, "name": "dense_4", "activity_regularizer": null, "trainable": true, "init": "glorot_uniform", "input_dim": null, "b_regularizer": null, "W_regularizer": {"l2": 0.0010000000474974513, "name": "WeightRegularizer", "l1": 0.0}, "activation": "linear", "output_dim": 10}}, {"class_name": "Activation", "config": {"activation": "softmax", "trainable": true, "name": "activation_4"}}]}'

Note my custom-defined layer SoftSwitch, which has attributes like zRegularizer that a normal Layer does not support.
Elsewhere when I try to retrieve my model from this string:
from keras.models import model_from_json
model2 = model_from_json(json_string)
I get the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 30, in model_from_json
    return layer_from_config(config, custom_objects=custom_objects)
  File "/usr/local/lib/python2.7/dist-packages/keras/utils/layer_utils.py", line 35, in layer_from_config
    return layer_class.from_config(config['config'])
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 758, in from_config
    layer = layer_from_config(first_layer)
  File "/usr/local/lib/python2.7/dist-packages/keras/utils/layer_utils.py", line 34, in layer_from_config
    instantiate=False)
  File "/usr/local/lib/python2.7/dist-packages/keras/utils/generic_utils.py", line 14, in get_from_module
    str(identifier))
Exception: Invalid layer: SoftSwitch

So it seems that the method model_from_json() does not recognize my custom class. So I tried again, this time passing a dictionary of my custom objects:
model2 = model_from_json(json_string, {'SoftSwitch': SoftSwitch})
This time it gave me the error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 30, in model_from_json
    return layer_from_config(config, custom_objects=custom_objects)
  File "/usr/local/lib/python2.7/dist-packages/keras/utils/layer_utils.py", line 35, in layer_from_config
    return layer_class.from_config(config['config'])
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 758, in from_config
    layer = layer_from_config(first_layer)
  File "/usr/local/lib/python2.7/dist-packages/keras/utils/layer_utils.py", line 35, in layer_from_config
    return layer_class.from_config(config['config'])
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 898, in from_config
    return cls(**config)
  File "/home/nkamra/Dropbox/USC PhD Research/SwitchedNN/sim/SoftSwitch.py", line 43, in __init__
    super(SoftSwitch, self).__init__(**kwargs)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 302, in __init__
    assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
AssertionError: Keyword argument not understood: output_dim

So apparently it did not even recognize output_dim, which is an attribute that normal layers already have. I could not figure out how to pass the output_dim argument through my custom objects dictionary (nor the zRegularizer class), and I kept getting similar errors with the other custom objects returned by the get_config() method of my SoftSwitch class.

I tried to figure out what is wrong, and this is my guess: I am generating a config dictionary in the get_config() method of my SoftSwitch class as follows:

def get_config(self):
    config = {'name': self.__class__.__name__, 
        'output_dim': self.output_dim,
        'init': self.init.__name__,
        'zRegularizer': self.zRegularizer.get_config(),
        'input_dim': self.input_dim}
    base_config = super(SoftSwitch, self).get_config()
    return dict(list(base_config.items()) + list(config.items()))

This is very similar to that of the Dense layer, since I copied it from there and inserted my own custom objects. Note that the returned dictionary contains config items from both the SoftSwitch layer and its superclass Layer. This config dictionary is then converted into JSON.
Later, while reconstructing the layers from json_string, each layer's config is eventually passed on to the superclass Layer as kwargs. At that point there is no distinction between the attributes of Layer and those of SoftSwitch, so Layer receives kwargs like output_dim and zRegularizer which it does not have. Note that the final error is raised by keras/engine/topology.py, lines 295-302:

allowed_kwargs = {'input_shape',
                  'batch_input_shape',
                  'input_dtype',
                  'name',
                  'trainable',
                  'create_input_layer'}
for kwarg in kwargs.keys():
    assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg

It only allows specific kwargs, and since output_dim and zRegularizer are not among them, it raises an exception. Can someone please tell me how to get around this issue and reconstruct all layers properly?
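If that diagnosis is right, a framework-free sketch reproduces it. The Layer stub below mirrors only the assertion quoted above (it is not real Keras code), and the SoftSwitch stub shows one way around the error: bind the custom config keys to named parameters in __init__ so they are consumed before **kwargs reaches the superclass:

```python
# Minimal stand-in for the kwargs check in keras/engine/topology.py;
# this mirrors the assertion quoted above, not the real Keras source.
ALLOWED_KWARGS = {'input_shape', 'batch_input_shape', 'input_dtype',
                  'name', 'trainable', 'create_input_layer'}

class Layer(object):
    def __init__(self, **kwargs):
        for kwarg in kwargs:
            assert kwarg in ALLOWED_KWARGS, \
                'Keyword argument not understood: ' + kwarg

class SoftSwitch(Layer):
    # Custom config keys are bound to named parameters here, so they are
    # consumed before **kwargs is forwarded to Layer.__init__.
    def __init__(self, output_dim=None, init='one', zRegularizer=None,
                 input_dim=None, **kwargs):
        self.output_dim = output_dim
        self.zRegularizer = zRegularizer
        super(SoftSwitch, self).__init__(**kwargs)

# The kind of per-layer dict that from_config passes to cls(**config):
config = {'output_dim': 784, 'init': 'one', 'input_dim': 784,
          'trainable': True, 'batch_input_shape': [None, 784]}
layer = SoftSwitch(**config)   # no unknown kwargs reach Layer.__init__
```

With the original signature (everything swallowed by **kwargs), the same config dict would trip the assertion exactly as in the traceback above.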

PS: I am new to Keras, so my understanding of the error might be incorrect. Please confirm that this is indeed the issue first and then propose some solution.

Thank you.

Most helpful comment

Just FYI model_from_json() takes two arguments: a JSON string, and a
dictionary of custom objects, such as custom layer classes. So you can do:

model = model_from_json(json_string, {'MyDense': MyDense})

All 15 comments

I'm having the same issue. Have you found any solution?

No, I do not have a solution so far. I have switched over to using pickle to save and load my models, till this gets fixed.

Same problem.

same problem~~(>_<)~~

Just FYI model_from_json() takes two arguments: a JSON string, and a
dictionary of custom objects, such as custom layer classes. So you can do:

model = model_from_json(json_string, {'MyDense': MyDense})

I am aware that model_from_json() takes two arguments and I tried using the dictionary too. It did not work out for custom layer classes (I have mentioned that in my initial post in quite a lot of detail).

Please review again.

I just read your original issue. It has absolutely nothing to do with JSON serialization; it is a basic implementation bug in your layer.

I assume everyone else in this thread was missing the custom_objects argument, since your bug is completely specific to your own code.

I have got it working: dumping and loading the custom layer! Apart from the custom_objects argument, you should also override methods of your custom layer class, such as get_config and from_config.
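For illustration, such overrides could look like the sketch below. The stubs only model the (de)serialization hooks; the SoftSwitchRegularizer fields (mu, kappa) are taken from the JSON earlier in this thread, and none of this is the real SoftSwitch implementation:

```python
class SoftSwitchRegularizer(object):
    # Stub standing in for the custom regularizer in this thread.
    def __init__(self, mu=0.001, kappa=0.01):
        self.mu = mu
        self.kappa = kappa

    def get_config(self):
        return {'name': self.__class__.__name__,
                'mu': self.mu, 'kappa': self.kappa}

class SoftSwitch(object):
    # Stub layer: only the deserialization hook is sketched.
    def __init__(self, output_dim=None, zRegularizer=None, **kwargs):
        self.output_dim = output_dim
        self.zRegularizer = zRegularizer

    @classmethod
    def from_config(cls, config):
        # Rebuild nested custom objects before calling __init__, so that
        # cls(**config) receives a regularizer instance, not a dict.
        config = dict(config)
        reg_cfg = config.pop('zRegularizer', None)
        if reg_cfg is not None:
            reg_cfg = dict(reg_cfg)
            reg_cfg.pop('name', None)
            config['zRegularizer'] = SoftSwitchRegularizer(**reg_cfg)
        return cls(**config)

# Round trip on the per-layer config from the JSON above:
cfg = {'output_dim': 784,
       'zRegularizer': {'mu': 0.001, 'kappa': 0.01,
                        'name': 'SoftSwitchRegularizer'}}
layer = SoftSwitch.from_config(cfg)
```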

I experienced the same problem: custom_objects did not help. This issue is related to #3911.

Save and load it this way:

# Save model
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
model.save_weights("color_tensorflow_real_mode_2_imgs.h5")

# Load the model

with open('model.json', 'r') as f:
    json = f.read()
model = model_from_json(json)

model.load_weights("color_tensorflow_real_mode_2_imgs.h5")

It will help you

Try this

from keras.models import model_from_json and then use model_from_json

Try this 

from keras.models import model_from_json and then use model_from_json

import keras
import numpy as np
from keras.applications import vgg16, inception_v3, resnet50, mobilenet
from keras.preprocessing.image import ImageDataGenerator, img_to_array, load_img
from keras.models import Sequential, Model, load_model
from keras.layers import Dropout, Flatten, Dense
from keras import optimizers
from keras.models import model_from_json

json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)

# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

I have used the same approach but am still getting the error.

with open('model_architecture.json', 'r') as f:
    model = tf.keras.models.model_from_json(f.read())

Something like this would work fine in this case

If you are using TensorFlow 2.0 with its built-in Keras, try one of the imports below:

from tensorflow.python.keras.models import model_from_json

or

from tensorflow.keras.models import model_from_json

The first one works fine for me

Save and load it this way […]

It did, right away!

