Keras: How to give loss weights for different time steps of Recurrent Neural Networks

Created on 7 Nov 2017 · 1 comment · Source: keras-team/keras

Hi, I am training a recurrent neural network (ConvLSTM2D). The problem is that the ground truth is only given at sparse time steps, e.g., the 1st and 10th time steps. I want to backpropagate only at those steps and not penalize the network at the other time steps. However, I still want to keep the inputs at the other time steps (i.e., the 2nd, 3rd, ...). Is there a way to give a loss weight vector to the network? The loss weight vector would look like this: [1,0,0,0,0,0,0,0,0,1,...], so the weight is 1 only at the time steps that have ground truth.

I don't think masking is the solution here, right? Any help appreciated. Thanks!

Most helpful comment

When compiling your model, use the parameter sample_weight_mode = 'temporal' and then supply sample weights when training. From the docs:

sample_weight: Optional array of the same length as x, containing weights to apply to the model's loss for each sample. In the case of temporal data, you can pass a 2D array with shape (samples, sequence_length) to apply a different weight to every timestep of every sample. In this case you should make sure to specify sample_weight_mode="temporal" in compile().
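A minimal sketch of building such a per-timestep weight array with NumPy, assuming hypothetical shapes (4 samples, 10 timesteps) and that ground truth exists only at the 1st and 10th steps, as in the question. The compile/fit calls are shown in comments since they require a built model:

```python
import numpy as np

# Hypothetical shapes for illustration: 4 samples, 10 timesteps each.
num_samples, timesteps = 4, 10

# Ground truth is available only at the 1st and 10th steps
# (indices 0 and 9), so the per-timestep weight is 1 there and 0 elsewhere.
labeled_steps = [0, 9]
sample_weight = np.zeros((num_samples, timesteps))
sample_weight[:, labeled_steps] = 1.0

# With a model compiled as
#   model.compile(loss='mse', optimizer='adam',
#                 sample_weight_mode='temporal')
# the 2D array above is passed to fit():
#   model.fit(x, y, sample_weight=sample_weight)
print(sample_weight[0])  # [1. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
```

Timesteps with weight 0 contribute nothing to the loss (and hence to the gradients), but the inputs at those steps still flow through the recurrent state, which is exactly the behavior asked for.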

