In my custom loss function I'm trying to use K.round(y_pred) as part of my calculation. This gives me a ValueError: None values not supported. When I remove the K.round() it all works perfectly. Does anyone know what the problem might be? Thank you very much.
Here is the full error log:
Traceback (most recent call last):
  File "explore_network_structure.py", line 141, in <module>
    history = model.fit(x_train, y_train, nb_epoch=2000, batch_size=batch_size, shuffle=True, verbose=1, class_weight=[class_weight_A, class_weight_B], sample_weight=train_weight, callbacks=[csv_logger, early_stopping])
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 620, in fit
    sample_weight=sample_weight)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 1079, in fit
    self._make_train_function()
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 696, in _make_train_function
    self.total_loss)
  File "/usr/local/lib/python2.7/dist-packages/keras/optimizers.py", line 154, in get_updates
    v = self.momentum * m - lr * g # velocity
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/math_ops.py", line 750, in binary_op_wrapper
    y = ops.convert_to_tensor(y, dtype=x.dtype.base_dtype, name="y")
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.py", line 657, in convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/constant_op.py", line 180, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/constant_op.py", line 163, in constant
    tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape))
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/tensor_util.py", line 346, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.
Could you please share the whole code?
Here is my custom loss:
from keras import backend as K

def func_custom_loss(y_true, y_pred):
    return K.mean(y_pred[:, 0], axis=-1)

Currently this works fine, but if I replace y_pred[:,0] with K.round(y_pred[:,0]) it throws the error above.
The issue is that the gradient of round is 0 almost everywhere. In TensorFlow it is registered as None. Maybe replace None with 0 in the optimizers?
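For example, a minimal check (assuming a TF 1.x graph-mode setup, not the original code) shows the None directly:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None,))
y = tf.round(x)

# No gradient is registered for Round, so this prints [None]; the SGD update
# "self.momentum * m - lr * g" in the traceback above then tries to convert
# that None to a tensor and raises "ValueError: None values not supported."
print(tf.gradients(y, x))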
@oxrider Have you solved this problem? I ran into the same issue, and replacing the None gradient with 0 did not work.
@ops.RegisterGradient("Round")
def _RoundGrad(_, unused_grad):
return [None]
The TensorFlow team will not fix it:
https://github.com/tensorflow/tensorflow/issues/783
There is a workaround provided there: change the Keras K.gradients code in the TensorFlow backend (rough sketch after the EDIT below). But that may kill your gradient, since the gradient will then always be 0.
EDIT: Found this link that may help you:
http://stackoverflow.com/questions/41780344/gradient-of-tf-floor-is-none
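A rough sketch of that "replace None by 0" workaround (assuming TF 1.x; gradients_none_to_zero is just an illustrative name, not an existing Keras helper):

import tensorflow as tf

def gradients_none_to_zero(loss, variables):
    # Substitute a zero tensor wherever tf.gradients yields None
    # (e.g. for anything that only reaches the loss through Round).
    grads = tf.gradients(loss, variables)
    return [g if g is not None else tf.zeros_like(v)
            for g, v in zip(grads, variables)]

This silences the ValueError, but as said above, every weight that only reaches the loss through the round then gets a zero gradient and stops learning.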
In TF, the gradient of round() is None.
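If you really need rounding in the forward pass, one common workaround (just a sketch, not something proposed in this thread) is the straight-through trick: keep K.round in the forward pass, but make the backward pass behave as if round were the identity:

from keras import backend as K

def round_through(x):
    # Forward value: K.round(x). Gradient: identity, because the
    # (rounded - x) correction is wrapped in stop_gradient.
    rounded = K.round(x)
    return x + K.stop_gradient(rounded - x)

def func_custom_loss(y_true, y_pred):
    return K.mean(round_through(y_pred[:, 0]), axis=-1)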
@yushuinanrong why the upvote? K.round is not differentiable; you cannot use it in a loss function.
@Dref360
Hi, yes, I realize that now. I've deleted my upvote. Thanks!
Hey!
Getting the same error:
ValueError: None values not supported.
I cannot figure out the source of the problem in my loss function (basically a weighted loss):

cumerr = K.cumsum(tf.multiply(
    K.cast(K.not_equal(K.sign(y_pred), K.sign(y_true)), 'float32'),
    K.abs(y_true)))

I guess K.abs or K.sign could be a problem if a 0 is encountered. Also, if so, would it be handled by what @QuantumLiu suggested?
Thanks!
Pretty sure K.sign is not differentiable.
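Since the hard comparison has no useful gradient with respect to y_pred, one possible direction (just a sketch; k is a made-up sharpness constant) is to replace the sign-mismatch indicator with a smooth surrogate, e.g. a sigmoid of -y_true * y_pred, which is close to 1 when the signs disagree and close to 0 when they agree:

from keras import backend as K

def weighted_mismatch_loss(y_true, y_pred, k=10.0):
    # Soft stand-in for cast(sign(y_pred) != sign(y_true)): the product
    # y_true * y_pred is negative exactly when the signs disagree.
    mismatch = K.sigmoid(-k * y_true * y_pred)
    return K.cumsum(mismatch * K.abs(y_true))

The cumsum-of-weighted-errors structure is kept from your version; only the non-differentiable comparison is swapped out.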