Hi,
I am trying to change the loss weights during training. When I checked the source code, I saw that the loss weights are set at compile time, and by the time I call fit, compilation is already over. Is there an easy way to accomplish this? I saw some issues related to the learning rate, but changing the code "as in the learning rate example" could affect other parts.
As I have limited experience with Theano and Keras, I need your help. Thanks. (Keras version 1.1.1)
I suppose you are referring to the loss_weights argument in compile. There are two ways you could do this:

1. (simple) just recompile your model with a new loss_weights argument value when you want to adjust the loss weights.
2. (advanced) define your loss weights as backend variables initialized with your loss_weights values, and change their values during training via a callback.

@fchollet there seems to be a bug in there. Changing the loss_weights in the middle of training seems to have no effect, and training continues with the initial weights. The following is a snippet of the code I used to test the loss_weights update. It successfully updates the values of alpha and beta, but this has no effect on the training. I also recompile the model with the updated weights, but still no effect...
import os
import sys

import numpy as np
import pandas as pd
from sklearn.utils import shuffle

from keras import backend as K
from keras.callbacks import Callback
from keras.layers import Dense, Input
from keras.models import Model
from keras.optimizers import SGD
from keras.utils.np_utils import to_categorical

input_1 = Input(shape=(m,))  # m: number of input features
hidden_0 = Dense(units=10, activation='relu')(input_1)
predictions = Dense(2, activation='softmax')(hidden_0)

sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model = Model(inputs=input_1, outputs=[predictions, predictions])

alpha = K.variable(1.0)
beta = K.variable(0.0)
model.compile(optimizer=sgd,
              loss=['categorical_crossentropy', 'categorical_crossentropy'],
              loss_weights=[alpha, beta],
              metrics=['accuracy'])

class CustomValidationLoss(Callback):
    def __init__(self, alpha, beta):
        self.alpha = alpha
        self.beta = beta

    def on_epoch_end(self, epoch, logs={}):
        if epoch == 1:
            print("in model loss weight set")
            self.alpha = self.alpha * 0.0
            self.beta = self.beta + 1.0
            print(epoch, K.get_value(self.alpha), K.get_value(self.beta))
            model.compile(optimizer=sgd,
                          loss=['categorical_crossentropy', 'categorical_crossentropy'],
                          loss_weights=[self.alpha, self.beta],
                          metrics=['accuracy'])
            sys.stdout.flush()

data = shuffle(pd.read_csv(os.path.join(dir_path, 'train_data.csv')))
y_mse = data['SOFT_LABEL'].values
y_mse = np.vstack([1 - y_mse, y_mse]).T
y = to_categorical(data['LABEL'].values, 2)
X = data.values

custom_validation_loss = CustomValidationLoss(alpha, beta)
model.fit(X, [y, y_mse],
          epochs=10,
          batch_size=1024,
          verbose=2,
          callbacks=[custom_validation_loss])
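For what it's worth, the snippet above rebinds `self.alpha` and `self.beta` to brand-new tensors (`self.alpha * 0.0` creates a new expression) instead of mutating the original `K.variable` objects, so the already-compiled training function keeps reading the old values; the approach in #2595 updates the variables in place with `K.set_value` instead. The rebinding-vs-mutation distinction can be shown in plain Python, where the hypothetical `SharedVar` class stands in for a backend variable:

```python
class SharedVar:
    """Hypothetical stand-in for a backend variable (e.g. K.variable)."""
    def __init__(self, value):
        self.value = value

graph_alpha = SharedVar(1.0)   # reference held by the compiled loss
cb_alpha = graph_alpha         # reference held by the callback

# Rebinding (what the snippet above does): cb_alpha now points at a
# brand-new object, while the "graph" still reads the original one.
cb_alpha = SharedVar(cb_alpha.value * 0.0)
assert cb_alpha.value == 0.0
assert graph_alpha.value == 1.0   # unchanged -> training is unaffected

# In-place update (the K.set_value analogue): mutate the shared object,
# so every holder of the reference sees the new value.
graph_alpha.value = 0.0
assert graph_alpha.value == 0.0
```

The same aliasing rule is why `K.set_value(self.alpha, 0.0)` works where reassignment does not: the compiled function and the callback must keep pointing at the same variable.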
@tofigh- have you been able to figure out how to do it?
Here's the solution to this problem: #2595
I couldn't save my model using the solution in #2595 due to a JSON serialization error. See my workaround described in #9444 if you run into the same problem.
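A common workaround for this class of error (not necessarily the one described in #9444) is to skip full-model serialization entirely and persist only the layer weights, so the non-serializable backend variables in loss_weights never reach the JSON encoder. A minimal sketch; the toy architecture and file path here are illustrative:

```python
import os
import tempfile

import numpy as np
from tensorflow import keras

def build_model():
    # Rebuild the architecture in code instead of deserializing it.
    return keras.Sequential([keras.Input(shape=(3,)), keras.layers.Dense(2)])

model = build_model()
path = os.path.join(tempfile.mkdtemp(), "model.weights.h5")
model.save_weights(path)           # weights only; no JSON config involved

restored = build_model()
restored.load_weights(path)
assert all(np.array_equal(a, b)
           for a, b in zip(model.get_weights(), restored.get_weights()))
```

The trade-off is that you must keep the model-building code around to reconstruct the architecture before calling load_weights.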
Hi,
I am very new to Keras. I hope my question doesn't bother you too much :)
@fchollet
(simple) just recompile your model with a new loss_weights argument value when you want to adjust the loss weights.
Just want to make very sure: when the model is recompiled, are the learned weights kept in RAM?
I ask because I am passing a changing number to the loss function in a for loop.
e.g.
for epoch in range(100):
    self.model.compile(optimizer=optimizer, loss=self.build_loss(epoch=epoch))
    m_loss = self.model.fit(x=[self.Z_full_transformed], y=[self.Z_full_transformed],
                            epochs=100, verbose=0,
                            batch_size=np.array(self.X_train).shape[0],
                            shuffle=False)
In this way, will the learned weights be preserved on self.model across iterations?
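From what I can tell, yes: compile only rebuilds the loss/training setup, while the learned weights live on the layers themselves and are untouched by recompiling. A minimal sketch that checks this directly; the toy model and data are illustrative:

```python
import numpy as np
from tensorflow import keras

# Tiny model trained for one epoch so the weights have been updated.
model = keras.Sequential([keras.Input(shape=(3,)), keras.layers.Dense(2)])
model.compile(optimizer="sgd", loss="mse")
model.fit(np.ones((4, 3)), np.ones((4, 2)), epochs=1, verbose=0)

before = [w.copy() for w in model.get_weights()]
model.compile(optimizer="sgd", loss="mae")   # recompile with a new loss

# The layer weights are exactly what they were before recompiling.
after = model.get_weights()
assert all(np.array_equal(a, b) for a, b in zip(before, after))
```

One caveat: recompiling does rebuild the optimizer's training function, so optimizer state (e.g. momentum accumulators) may not carry over between loop iterations.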
P.S. Thanks for everything you've done for the Keras package; it's wonderful and has saved me a lot of time.
Best,
J
TypeError: ('Not JSON Serializable:')
This error comes up while saving the model. Any updates on this?
https://github.com/keras-team/keras/issues/9444