I'm using Keras to train a model with regularization.
On each epoch, the printed loss is the sum of the data loss and the regularization term. How can I get it to print them separately instead of their sum?
You can add the loss function as a metric, e.g.:

from keras import metrics
# The metric reports the bare crossentropy; the printed "loss" includes regularization.
model.compile(..., metrics=[metrics.categorical_crossentropy])
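A toy numeric sketch (all numbers hypothetical, plain NumPy rather than Keras) of what this surfaces: the value Keras prints as "loss" is the data loss plus the regularization penalty, while a metric computed from the same loss function reports the data loss alone, so the penalty is recoverable by subtraction.

```python
import numpy as np

# One-hot targets and predicted probabilities for a tiny batch (made up).
y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8]])

# Categorical crossentropy, averaged over the batch (the "data loss").
data_loss = -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# An L2 weight penalty, as a kernel regularizer would add (weights made up).
weights = np.array([0.5, -0.3, 0.2])
l2 = 0.01
reg_loss = l2 * np.sum(weights ** 2)

# What the progress bar labels "loss" vs. the added metric:
printed_loss = data_loss + reg_loss   # Keras's reported "loss"
printed_metric = data_loss            # the added crossentropy metric

# Subtracting the metric from the loss recovers the regularization term.
recovered_reg = printed_loss - printed_metric
```

With both numbers in the logs, the regularization term can be monitored per epoch even though Keras does not log it separately.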
Wouldn't that (adding the loss function as a metric as well) cause the metric to be computed twice?
Perhaps it would be better if the regularization losses had their own standard entry in the logs dict, since otherwise it's pretty confusing.
Especially when training multiple tasks: the progress bar prints each task's loss, and I expected the overall loss to be their sum, but it wasn't; it was the sum of all the task losses plus the regularization loss.
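To make that bookkeeping concrete, here is a minimal sketch with hypothetical numbers and task names: with multiple outputs, the total Keras reports as "loss" is the sum of the per-task losses plus all regularization penalties, not the per-task losses alone.

```python
# Hypothetical per-task losses as shown in the progress bar.
task_losses = {"task_a_loss": 0.42, "task_b_loss": 0.31}

# Regularization penalty folded into the total but not logged separately.
reg_loss = 0.05

# What one might expect the total to be (sum of displayed task losses):
expected_total = sum(task_losses.values())

# What Keras actually prints as "loss":
printed_total = sum(task_losses.values()) + reg_loss
```

The gap between `printed_total` and `expected_total` is exactly the regularization term, which is why a dedicated logs entry for it would remove the confusion.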
Thanks