Models: [Deeplab v3+] Modify Loss function

Created on 26 Mar 2019 · 3 comments · Source: tensorflow/models

@aquariusjay
Hi Jay,
I want to modify the loss function in train_utils.py, for example, change `tf.losses.softmax_cross_entropy()` to `tf.losses.softmax_cross_entropy() + sigmoid_cross_entropy_with_logits()`. However, I didn't see an assignment like `loss = tf.losses.softmax_cross_entropy()`, so I don't know where to make the change. Could you please help me? Many thanks for your help.

Most helpful comment

Hi lizleo,

Thanks for bringing up this issue.
I think you could do something similar to what we do here: first compute the softmax and the sigmoid losses separately, then call `tf.losses.add_loss(loss_sum)` on their sum.
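The pattern above (compute each loss, then register only the combined scalar) can be sketched numerically. The sketch below is a plain-Python illustration of the math behind the two TensorFlow ops involved; the function names `softmax_cross_entropy` and `sigmoid_cross_entropy` are local stand-ins for `tf.losses.softmax_cross_entropy` and `tf.nn.sigmoid_cross_entropy_with_logits`, and the example labels/logits are made up:

```python
import math

def softmax_cross_entropy(onehot_labels, logits):
    # Numerically stable softmax cross-entropy for one example,
    # mirroring what tf.losses.softmax_cross_entropy computes.
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return sum(y * (log_sum_exp - z) for y, z in zip(onehot_labels, logits))

def sigmoid_cross_entropy(labels, logits):
    # Stable elementwise sigmoid cross-entropy, summed over classes,
    # mirroring tf.nn.sigmoid_cross_entropy_with_logits:
    #   max(z, 0) - z * y + log(1 + exp(-|z|))
    return sum(max(z, 0.0) - z * y + math.log1p(math.exp(-abs(z)))
               for y, z in zip(labels, logits))

# Hypothetical one-hot labels and logits for a 3-class example.
labels = [0.0, 1.0, 0.0]
logits = [0.5, 2.0, -1.0]

# Compute both losses first, then form a single combined scalar --
# the analogue of calling tf.losses.add_loss(loss_sum) exactly once
# so the training loop picks up one extra loss term, not two.
loss_sum = (softmax_cross_entropy(labels, logits)
            + sigmoid_cross_entropy(labels, logits))
print(loss_sum)
```

In the real train_utils.py you would build `loss_sum` from tensors instead of Python floats and pass it to `tf.losses.add_loss`, which appends it to the graph's loss collection that `tf.losses.get_total_loss()` later sums.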

Cheers,

All 3 comments


Hi @aquariusjay ,

Thanks for your reply. Now I know how to modify the loss function. Thanks again for your help.

Best,

Sure thing. Glad to know that you figured it out.

Cheers,
