Hello all,
after training the model, I save it via model.save('SSD300.h5'). When I try to load it with model = load_model('SSD300.h5'), I get the error:
_ValueError: Unknown layer: Normalize._
The same should then apply to the PriorBox layer. After some research on Google, I found out that you have to add a get_config method to the respective custom layers. Here is the code as I understood it from examples, for the Normalize and PriorBox layers:
def get_config(self):
    config = {'scale': self.scale}
    base_config = super(Normalize, self).get_config()
    return dict(list(base_config.items()) + list(config.items()))
def get_config(self):
    config = {'img_size': self.img_size,
              'min_size': self.min_size,
              'max_size': self.max_size,
              'aspect_ratios': self.aspect_ratios,
              'variances': self.variances,
              'clip': self.clip}
    base_config = super(PriorBox, self).get_config()
    return dict(list(base_config.items()) + list(config.items()))
When you want to load a model with custom layers, you pass a dictionary of the custom objects to the load_model() call:
model = load_model('SSD300.h5', custom_objects={'Normalize': Normalize, 'PriorBox': PriorBox})
When I then load the model, I get the following error:
_ValueError: can only convert an array of size 1 to a Python scalar_
From what I understand, this comes from the numpy arrays aspect_ratios and variances in the PriorBox layers, which cannot be converted to a Python scalar.
Can someone maybe help me with this problem?
Thank you.
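One possible fix, sketched below: Keras stores the layer config as a JSON string inside the HDF5 file, and numpy arrays are not JSON-serializable, so converting them to plain Python lists inside get_config may sidestep the error. This assumes the PriorBox attributes match the snippet above, and the __init__ would then need to accept lists (or convert them back with np.array):

import numpy as np

def get_config(self):
    # Sketch, assuming aspect_ratios and variances are stored as numpy arrays:
    # np.asarray(...).tolist() yields plain Python floats, which the JSON
    # encoder used by Keras can serialize without complaint.
    config = {'img_size': self.img_size,
              'min_size': self.min_size,
              'max_size': self.max_size,
              'aspect_ratios': np.asarray(self.aspect_ratios).tolist(),
              'variances': np.asarray(self.variances).tolist(),
              'clip': self.clip}
    base_config = super(PriorBox, self).get_config()
    return dict(list(base_config.items()) + list(config.items()))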
Hi @emillion92,
you can set them as defaults instead of passing them through the config; that's what I did:
def __init__(...):
    ...
    self.variance = np.array([0.1, 0.1, 0.2, 0.2])
Just don't forget, when you load the model, to use the same values you trained with.
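For reference, a minimal sketch of that suggestion (the constructor signature here is an assumption, not the original layer's exact API; other methods of the layer are omitted):

import numpy as np
from keras.layers import Layer  # older Keras versions: from keras.engine.topology import Layer

class PriorBox(Layer):
    # Hypothetical sketch: hard-code the variances as a default instead of
    # serializing them, so get_config never has to handle the numpy array.
    def __init__(self, img_size, min_size, max_size=None, aspect_ratios=None,
                 clip=True, **kwargs):
        self.img_size = img_size
        self.min_size = min_size
        self.max_size = max_size
        self.aspect_ratios = aspect_ratios
        self.clip = clip
        # Fixed default; this must match the value used during training.
        self.variances = np.array([0.1, 0.1, 0.2, 0.2])
        super(PriorBox, self).__init__(**kwargs)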
@emillion92 were you able to get it working? I am running into one error after another. I have gone back to saving the weights separately for now, but do share what worked for you.
@aishanou unfortunately I am also running into errors, so I have gone back to storing the architecture and weights separately for now. I hope someone can come up with a working solution to help us :)
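For anyone landing here, the weights-only workaround mentioned above looks roughly like this (build_ssd300() is a hypothetical function standing in for whatever code builds the SSD300 architecture):

# Save only the weights; the architecture is rebuilt in code instead of
# being deserialized from the HDF5 file, so custom layers need no get_config.
model.save_weights('SSD300_weights.h5')

# Later, in a fresh session: reconstruct the graph, then load the weights.
model = build_ssd300()  # hypothetical builder that recreates the SSD300 architecture
model.load_weights('SSD300_weights.h5')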
any progress?