Keras: higher val_loss but higher accuracy, why?

Created on 24 Aug 2016 · 4 comments · Source: keras-team/keras

During training:
Q1: train_acc keeps increasing but val_acc oscillates.
Q2: higher val_loss comes with higher accuracy.


All 4 comments

Q1:
Obviously training accuracy will keep increasing (see overfitting) as that's the data the model uses for weight updates.

Validation accuracy will typically oscillate heavily if the validation data is too small and dissimilar to the training data, but you should always expect it to oscillate. Average the validation accuracy curve and look for positive trends instead.
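To see the trend through the noise, a simple moving average is enough. Here is a minimal sketch in plain NumPy; the synthetic curve and the window size are placeholders, and with a real model you would smooth `history.history['val_acc']` (called `val_accuracy` in newer Keras versions), where `history` is the object returned by `model.fit(...)`:

```python
import numpy as np

def moving_average(values, window=5):
    """Running mean to smooth a noisy per-epoch metric curve."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode='valid')

# Synthetic example: a slowly improving val_acc curve with heavy noise.
epochs = np.arange(100)
val_acc = 0.6 + 0.002 * epochs + np.random.normal(scale=0.05, size=100)

smoothed = moving_average(val_acc, window=7)
print("last raw value:      %.3f" % val_acc[-1])
print("last smoothed value: %.3f" % smoothed[-1])

# Judge progress from the smoothed curve's trend rather than from
# single-epoch swings in the raw values.
```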

Q2:
Higher val_loss with higher accuracy? Care to elaborate a bit?

@carlthome
Thanks for your answer!
About Q2: usually lower val_loss comes with higher accuracy,
but here I am seeing higher val_loss with higher accuracy,
like this:
epoch 55: val_loss: 1.3, accuracy: 85%
epoch 56: val_loss: 1.4, accuracy: 86%

Smaller loss does not necessarily mean higher accuracy, it can mean that the same amount of correct predictions were made with higher confidence. This also means that the accuracy can actually increase a bit if the loss increases as well.
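For concreteness, here is a small self-contained NumPy sketch (with made-up prediction probabilities, not from the model in question) showing how cross-entropy loss can rise from one epoch to the next even as arg-max accuracy improves:

```python
import numpy as np

def cross_entropy(y_true, y_prob, eps=1e-7):
    """Mean categorical cross-entropy for one-hot targets."""
    y_prob = np.clip(y_prob, eps, 1.0 - eps)
    return -np.mean(np.sum(y_true * np.log(y_prob), axis=1))

def accuracy(y_true, y_prob):
    """Fraction of samples whose arg-max prediction matches the target."""
    return np.mean(np.argmax(y_prob, axis=1) == np.argmax(y_true, axis=1))

# Four validation samples with one-hot targets (classes 0, 0, 1, 1).
y_true = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)

# "Epoch 55": 3/4 correct; the correct predictions are very confident
# and the single mistake is only mildly wrong.
probs_55 = np.array([[0.95, 0.05],
                     [0.90, 0.10],
                     [0.10, 0.90],
                     [0.60, 0.40]])  # wrong, but not by much

# "Epoch 56": 4/4 correct, but every prediction is barely above 0.5,
# so the model is right more often yet far less confident.
probs_56 = np.array([[0.55, 0.45],
                     [0.55, 0.45],
                     [0.45, 0.55],
                     [0.45, 0.55]])

for name, probs in [("epoch 55", probs_55), ("epoch 56", probs_56)]:
    print("%s  loss=%.3f  acc=%.2f"
          % (name, cross_entropy(y_true, probs), accuracy(y_true, probs)))
# Loss goes up (~0.29 -> ~0.60) while accuracy also goes up (0.75 -> 1.00).
```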

@alex-j-j
Thank you very much!
