Keras: Custom loss with multiple model outputs

Created on 26 Mar 2019 · 9 comments · Source: keras-team/keras

I have a model with multiple outputs from different layers: O is the output from the softmax layer; y1 and y2 come from intermediate hidden layers.

  `m = keras.models.Model(inputs=x, outputs=[O, y1, y2])`

I want to compute the cross-entropy loss between O and the true labels, and the MSE loss between y1 and y2. My custom loss function would look like:

  ```
  from keras.losses import mean_squared_error, categorical_crossentropy

  def Myloss(O_true, O_pred, y1, y2):
      MSE = mean_squared_error(y1, y2)
      CE = categorical_crossentropy(O_true, O_pred)
      return MSE + CE
  ```

Now, how do I pass the model outputs to
`model.compile(loss=Myloss, ...)`?


All 9 comments

`model.compile(loss={'o': Myloss, 'y1': Myloss, 'y2': Myloss}, optimizer='adam')`
Hope it helped.
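Note that with a loss dict like this, Keras calls each entry with only that output's own (y_true, y_pred) pair, so the four-argument Myloss above cannot be passed directly; each output needs its own two-argument loss. A rough sketch of that shape, reusing `m` and the output names from the question:

```
from keras.losses import categorical_crossentropy, mean_squared_error

def o_loss(y_true, y_pred):
    # cross-entropy for the softmax output O against its own target
    return categorical_crossentropy(y_true, y_pred)

def aux_loss(y_true, y_pred):
    # MSE for an intermediate output against a target supplied for it;
    # a loss directly between y1 and y2 cannot be written in this form
    return mean_squared_error(y_true, y_pred)

m.compile(optimizer='adam',
          loss={'O': o_loss, 'y1': aux_loss, 'y2': aux_loss})
```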

I am trying to do something similar. @Jamesswiz, when I run the Myloss function, I get an error saying that 'O' (O_true, O_pred) is a missing argument to the loss function.

How are the outputs of the model related to the loss and compile functions?

@bairavi26 please post your code here.
Which fitting method do you use: fit, fit_generator, or train_on_batch?

Please fill the issue template here. Could you update them if they are relevant in your case, or leave them as N/A? Along with the template, please provide as many details as possible to find the root cause of the issue. It would be great if you can provide a small code to reproduce the error. Thanks!

Automatically closing due to lack of recent activity. Please update the issue when new information becomes available, and we will reopen the issue. Thanks!

Here is what I did to get the multi-output thing working. Suppose your model is:

```
model = Model(inputs=[input1, input2], outputs=[output1, output2])
```

Build loss function 1:

```
def loss_fun1(y_true, y_pred):
    # do whatever you want
    loss = tf.nn.sigmoid_cross...
    return loss
```

Build loss function 2:

```
def loss_fun2(y_true, y_pred):
    # do whatever you want
    loss = tf.nn.sigmoid_cross...
    return loss
```

Define the loss dict (the keys are the names of the output layers):

```
losses = {'output_layer1': loss_fun1,
          'output_layer2': loss_fun2}
```

Define the loss weights:

```
lossWeights = {'output_layer1': 0.5,
               'output_layer2': 0.5}
```

Create placeholders for the targets (Keras throws an error if they are not supplied, be careful):

```
target1 = tf.placeholder(dtype='int32', shape=())    # the shape your target for output1 has
target2 = tf.placeholder(dtype='float32', shape=())  # the shape your target for output2 has
```

Compile:

```
model.compile(optimizer='rmsprop',
              loss=losses,
              loss_weights=lossWeights,
              target_tensors=[target1, target2])
```

So should 'output_layer1' be the layer object, or the name set in the layer, e.g. Dense(units=3, name='name')?

It's the 'name' of the output layer.
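To make that concrete, a small illustrative snippet (layer sizes and names made up): the dict keys must match the `name=` given to each output layer, not the layer objects.

```
from keras.layers import Input, Dense
from keras.models import Model

inp = Input(shape=(8,))                 # made-up input size
h = Dense(16, activation='relu')(inp)
out1 = Dense(3, activation='softmax', name='output_layer1')(h)
out2 = Dense(1, activation='sigmoid', name='output_layer2')(h)
model = Model(inputs=inp, outputs=[out1, out2])

# the keys below are the `name` strings above, not the layer objects
model.compile(optimizer='rmsprop',
              loss={'output_layer1': 'categorical_crossentropy',
                    'output_layer2': 'binary_crossentropy'})
```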

But if I want to write a combined loss function for both layers, then what should I do?

```
def combined_loss(y_true, y_pred):
    multi_pred, binary_pred = y_pred
    multi_true, binary_true = y_true
    multi_loss = tf.losses.categorical_crossentropy(multi_true, multi_pred)
    binary_loss = tf.losses.binary_crossentropy(binary_true, binary_pred)
    loss = multi_loss + binary_loss
    return loss
```
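One workaround, as a sketch under assumed shapes (5 classes for the multi-class head, 1 unit for the binary head): concatenate the two heads into a single output so one loss function receives both predictions, then split them back apart inside the loss.

```
from keras.layers import Input, Dense, Concatenate
from keras.models import Model
from keras.losses import categorical_crossentropy, binary_crossentropy

inp = Input(shape=(32,))                                      # made-up input size
h = Dense(64, activation='relu')(inp)
multi_head = Dense(5, activation='softmax', name='multi')(h)  # 5 classes assumed
binary_head = Dense(1, activation='sigmoid', name='binary')(h)
combined = Concatenate(name='combined')([multi_head, binary_head])

def combined_loss(y_true, y_pred):
    # first 5 columns are the multi-class part, the last column is the binary part
    multi_true, binary_true = y_true[:, :5], y_true[:, 5:]
    multi_pred, binary_pred = y_pred[:, :5], y_pred[:, 5:]
    return (categorical_crossentropy(multi_true, multi_pred)
            + binary_crossentropy(binary_true, binary_pred))

model = Model(inputs=inp, outputs=combined)
model.compile(optimizer='adam', loss=combined_loss)
# the targets passed to fit() then have to be concatenated the same way
```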
