@aquariusjay
Hi Jay,
I want to modify the loss function in train_utils.py, for example, change "tf.losses.softmax_cross_entropy()" to "tf.losses.softmax_cross_entropy() + sigmoid_cross_entropy_with_logits()". However, I couldn't find an assignment such as "loss = tf.losses.softmax_cross_entropy()", so I don't know where to make the change. Could you please help me? Many thanks for your help.
Hi lizleo,
Thanks for bringing up this issue.
I think you could do something similar to what we do here:
Specifically, you first compute the softmax loss and the sigmoid loss separately, sum them, and then register the result with tf.losses.add_loss(loss_sum).
Cheers,
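To make the suggestion above concrete, here is a minimal NumPy sketch of the combined loss (independent of the TensorFlow graph machinery; the function names here are hypothetical stand-ins that mirror tf.losses.softmax_cross_entropy and tf.nn.sigmoid_cross_entropy_with_logits, using the same numerically stable formulations):

```python
import numpy as np

def softmax_cross_entropy(logits, onehot_labels):
    # Numerically stable log-softmax, then mean per-example cross-entropy.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -(onehot_labels * log_softmax).sum(axis=-1).mean()

def sigmoid_cross_entropy_with_logits(logits, labels):
    # Stable form of the sigmoid cross-entropy:
    # max(x, 0) - x * z + log(1 + exp(-|x|))
    return (np.maximum(logits, 0) - logits * labels
            + np.log1p(np.exp(-np.abs(logits)))).mean()

logits = np.array([[2.0, -1.0], [0.5, 0.3]])
onehot = np.array([[1.0, 0.0], [0.0, 1.0]])

# The combined loss is simply the sum of the two terms; in the TF1 graph
# you would then pass this scalar to tf.losses.add_loss(loss_sum).
loss_sum = (softmax_cross_entropy(logits, onehot)
            + sigmoid_cross_entropy_with_logits(logits, onehot))
```

In the actual train_utils.py graph code the same pattern applies: build each scalar loss tensor, add them, and hand the sum to tf.losses.add_loss so it is picked up by tf.losses.get_total_loss().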
Hi @aquariusjay ,
Thanks for your reply. Now I know how to modify the loss function. Thanks again for your help.
Best,
Sure thing. Glad to hear that you figured it out.
Cheers,