Keras: Serialisation doesn't allow variables as loss_weights

Created on 21 Feb 2018 · 8 comments · Source: keras-team/keras

I've been training a model with multiple losses, where the loss weights need to be updated during training via a callback. This works fine, except that when I try to save the model I get an error:

TypeError: ('Not JSON Serializable:', )

I've written two test cases, one passing, one failing, to demonstrate:
test cases

I'm running on OSX, Python 3.6.4, Tensorflow backend, CPU only. I freshly installed everything for the test.
pip list:

absl-py (0.1.10)
appdirs (1.4.3)
attrs (17.4.0)
bleach (1.5.0)
h5py (2.7.1)
html5lib (0.9999999)
Keras (2.1.4)
Markdown (2.6.11)
numpy (1.14.1)
packaging (16.8)
pip (9.0.1)
pluggy (0.6.0)
protobuf (3.5.1)
py (1.5.2)
pyparsing (2.2.0)
pytest (3.4.1)
PyYAML (3.12)
scipy (1.0.0)
setuptools (38.5.1)
six (1.10.0)
tensorflow (1.5.0)
tensorflow-tensorboard (1.5.1)
Werkzeug (0.14.1)
wheel (0.30.0)

To investigate

Most helpful comment

I experienced the same issue, but it seems I've figured out a workaround. Define a custom loss function with an additional argument, using a closure as described in the comments on issue 2121. Set this function as the loss for your model and pass a K.variable as the argument; that will be your loss weight. The K.variable is then updated by a custom callback, as described in issue 2595.

To give you a clearer idea:

import tensorflow as tf

def dice_loss(training_mask, loss_weight):
    def loss(y_true, y_pred):
        eps = 1e-5
        intersection = tf.reduce_sum(y_true * y_pred * training_mask)
        union = tf.reduce_sum(y_true * training_mask) + tf.reduce_sum(y_pred * training_mask) + eps
        dice = 1. - (2. * intersection / union)
        return dice * loss_weight  # scaled by the K.variable captured in the closure
    return loss

score_map_loss_weight = K.variable(1.)
loss_weight = LossWeight(score_map_loss_weight) # this is your custom callback

model.compile(loss=[dice_loss(training_mask, score_map_loss_weight), rbox_loss(training_mask)], optimizer=opt)
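For completeness, a minimal sketch of what such a `LossWeight` callback could look like (shown with tf.keras imports; the `schedule` parameter is an assumption, not part of the original comment — any function mapping epoch index to a float works):

```python
import tensorflow.keras.backend as K
from tensorflow.keras.callbacks import Callback

class LossWeight(Callback):
    """Writes a new value into a backend variable at the start of each epoch."""

    def __init__(self, weight_var, schedule):
        super(LossWeight, self).__init__()
        self.weight_var = weight_var  # the K.variable captured by the loss closure
        self.schedule = schedule      # hypothetical: maps epoch index -> float

    def on_epoch_begin(self, epoch, logs=None):
        K.set_value(self.weight_var, self.schedule(epoch))
```

Because the loss closure holds a reference to the same variable, the updated value takes effect in the next epoch's loss computations.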

All 8 comments

@Dref360 has there been any more investigation into this, or is the only current workaround the solution that @kurapan has proposed?

@kurapan thanks for that suggested fix, it worked for me!

I had the same problem. As a very dirty fix, I added a few lines of code to the model-saving code, just before

raise TypeError('Not JSON Serializable:', obj)

    else:
        return float(obj.get_value() or 0)

I hope someone can figure out a better solution.
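The patch described above amounts to something like the following standalone sketch of the JSON fallback (the real helper lives inside Keras's saving code, and its exact name and location vary by version; `json_fallback` is a hypothetical name):

```python
def json_fallback(obj):
    """Sketch of the patched fallback used when serialising the model config.

    If the object looks like a Keras backend variable (it has get_value),
    serialise its current float value instead of raising TypeError.
    """
    if hasattr(obj, 'get_value'):
        return float(obj.get_value() or 0)
    raise TypeError('Not JSON Serializable:', obj)
```

Note that this bakes the variable's value at save time into the config, so a reloaded model starts from whatever loss weight was current when you saved.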

Is there any solution for this bug? I think the Keras team should fix it in a later version. Is there any update on this issue?

If the 'loss_weights' argument worked with variables, it would be better than using a custom loss function.

@fchollet Could really use a fix for this! Thanks.

I would need that fix too. The workaround from @kurapan does work, but it means that if you still want to log the unweighted loss values, you need to introduce additional metrics.

It is not just that this error is thrown when trying to save the model at the end of an epoch; the loss_weights also do nothing to the total loss. My loss weights for a two-output network are {'out1': alpha, 'out2': 1 - alpha}, where alpha starts at 0 and ends at 1 via an update equation that depends on the epoch number, applied with a callback of course. So the loss for 'out1' should be zero for the first epoch, but both losses contribute without any multiplication by the loss_weights. The only solution for me is to build a custom loss function, but I still need this issue to be fixed, because you want to use the built-in Keras losses, and sometimes it is hard to build the same loss yourself.
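The scenario above can be expressed with the closure workaround while still reusing the built-in losses, by wrapping them instead of re-implementing them. A sketch with tf.keras imports — `alpha`, `out1`, and `out2` follow the comment, everything else (the `weighted` helper, the choice of MSE) is an assumption for illustration:

```python
import tensorflow as tf
import tensorflow.keras.backend as K

alpha = K.variable(0.0)  # driven from 0 towards 1 by a callback during training

def weighted(base_loss, weight_fn):
    """Wrap any built-in loss so its weight is read at computation time."""
    def loss(y_true, y_pred):
        return weight_fn() * base_loss(y_true, y_pred)
    return loss

mse = tf.keras.losses.mean_squared_error
# model.compile(optimizer='adam',
#               loss={'out1': weighted(mse, lambda: alpha),
#                     'out2': weighted(mse, lambda: 1.0 - alpha)})
```

The weight is fetched through a lambda so that `1.0 - alpha` is recomputed every time the loss runs, rather than being frozen into a constant when `compile` is called.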
