Using Python 3.5.2 and TensorFlow 1.1 RC.
I'm trying to use a TensorFlow metric function in Keras. The required interface seems to be the same, but calling:
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=[tensorflow.metrics.auc])
results in the error:
Using TensorFlow backend.
Traceback (most recent call last):
  File "/Users/ophir/dev/ophir/tf_keras_metrics.py", line 49, in <module>
    metrics=[precision, recall, tensorflow.metrics.auc]
  File "/Users/ophir/anaconda3/envs/p3/lib/python3.5/site-packages/keras/engine/training.py", line 956, in compile
    metric_result = masked_metric_fn(y_true, y_pred, mask=masks[i])
  File "/Users/ophir/anaconda3/envs/p3/lib/python3.5/site-packages/keras/engine/training.py", line 489, in masked
    return K.mean(score_array)
  File "/Users/ophir/anaconda3/envs/p3/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 1120, in mean
    axis = _normalize_axis(axis, ndim(x))
  File "/Users/ophir/anaconda3/envs/p3/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 437, in ndim
    dims = x.get_shape()._dims
AttributeError: 'tuple' object has no attribute 'get_shape'

Process finished with exit code 1
So the tensorflow.metrics.XXX functions return a tuple, where the first element is the float tensor holding the result. I tried passing just that tensor to Keras and got an error as well.
With this change https://github.com/fchollet/keras/issues/4402 I don't get any errors. However, the returned value is always 0 regardless of the metric.
@fchollet Any chance you can look into this? There aren't many metrics available in Keras, so it would be great if we could use TensorFlow metrics.
@d4nst I'm using something like this:
from tensorflow.python.ops import control_flow_ops

@static_vars(stream_vars=None)
def auc_roc(y_true, y_pred):
    value, update_op = tf.contrib.metrics.streaming_auc(
        y_pred, y_true, curve='ROC', name='auc_roc')
    auc_roc.stream_vars = [i for i in tf.local_variables()
                           if i.name.split('/')[0] == 'auc_roc']
    return control_flow_ops.with_dependencies([update_op], value)
You just need to add this function to the metrics list passed to model.compile. I then have a callback that resets stream_vars in the first and last batch for all the metrics.
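The thread doesn't spell the reset callback out, but it might look roughly like this. This is a sketch, not the author's actual code: it assumes the decorator above stores the metric's local variables on a `stream_vars` attribute, the class name is hypothetical, and it uses the `tf.compat.v1` session API.

```python
import tensorflow as tf

class ResetMetricVarsCallback(tf.keras.callbacks.Callback):
    """Re-initializes streaming metric variables so every epoch starts
    accumulating from a clean state. `metric_fns` are the wrapped metric
    functions carrying a `stream_vars` attribute (hypothetical name from
    the decorator above)."""

    def __init__(self, metric_fns):
        super().__init__()
        self.metric_fns = metric_fns

    def on_epoch_begin(self, epoch, logs=None):
        # Graph-mode (TF1-style) session; only meaningful when the
        # streaming metrics actually created local variables.
        sess = tf.compat.v1.keras.backend.get_session()
        for fn in self.metric_fns:
            if getattr(fn, 'stream_vars', None):
                sess.run(tf.compat.v1.variables_initializer(fn.stream_vars))
```

It would then be passed as `model.fit(..., callbacks=[ResetMetricVarsCallback([auc_roc])])`.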
@fchollet does the above seem correct to you?
oh and the decorator simply looks like:
def static_vars(**kwargs):
    def decorate(func):
        for k in kwargs:
            setattr(func, k, kwargs[k])
        return func
    return decorate
Any news about this problem? I've changed the initialization in the backend file, but the value returned is always zero.
@dralves - I'm trying to make your suggestion work in tensorflow v1.3, and it's choking on:
return control_flow_ops.with_dependencies([update_op], value)
NameError: name 'control_flow_ops' is not defined
Any suggestions? I have a feeling it's probably deprecated, but I'm not sure how to update it. Also, since I'm not doing streaming, do I need the decorator and the callback that resets stream_vars?
For now, it's sufficient to add the variables created for a tf metric to the GraphKeys.GLOBAL_VARIABLES collection, so that they will be initialized by the new Keras session during training.
def auc_roc(y_true, y_pred):
    # any tensorflow metric
    value, update_op = tf.contrib.metrics.streaming_auc(y_pred, y_true)

    # find all variables created for this metric
    metric_vars = [i for i in tf.local_variables() if 'auc_roc' in i.name.split('/')[1]]

    # Add metric variables to the GLOBAL_VARIABLES collection.
    # They will be initialized for a new session.
    for v in metric_vars:
        tf.add_to_collection(tf.GraphKeys.GLOBAL_VARIABLES, v)

    # force the metric value to update
    with tf.control_dependencies([update_op]):
        value = tf.identity(value)
    return value
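For completeness (this is not part of the comment above), a sketch of wiring such a wrapped metric into `model.compile`; the tiny architecture and the `build_model` helper are hypothetical, and `auc_roc` stands for the function defined above or any other Keras-compatible metric:

```python
import tensorflow as tf

def build_model(auc_roc):
    """Toy binary classifier; `auc_roc` is expected to be a wrapped
    TF metric function (or any metric Keras accepts in `compile`)."""
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(loss='binary_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy', auc_roc])
    return model
```

With the TF1-era wrapper, keep in mind that the metric's local variables must be initialized before training starts, which is exactly what the GLOBAL_VARIABLES trick above arranges.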
Thanks Bogdan, that's a brilliant solution! My guess is that this would apply to using any Tensorflow method in Keras, right?
Hello, I have the same issue, and the solution of @BogdanRuzh does not work for me: it always gives zero as the result. TensorFlow version 1.5, Keras version 2.1.3.
@BogdanRuzh your solution works with tf.contrib.metrics.streaming_auc(), but when using tf.metrics.auc() it gives 0. Not sure why.
@BogdanRuzh @pinkeshbadjatiya
I have tried the above solution. It outputs AUC during model training; however, the values I get are really similar to accuracy (my dataset is mostly 0s). I also tried comparing the AUC on the test set from model.evaluate against the AUC from sklearn.metrics.auc, and the results are pretty different: I got 0.9 with this solution but 0.5 with sklearn.
Has this happened to anyone else?
@sherrylau the above method does not give actual AUC scores. Have a look here: https://www.tensorflow.org/api_docs/python/tf/contrib/metrics/streaming_auc
@pinkeshbadjatiya thanks for your response. I know the link you referenced mentions that streaming_auc computes an approximate AUC using a Riemann sum. I understand there's some discrepancy, but 0.9 vs. 0.5 is way too different. The 0.9 I got is from the model.evaluate output, while the 0.5 is based on the output of model.predict. Does your case happen to be similar?
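For anyone debugging this kind of discrepancy: the exact ROC AUC that sklearn reports on model.predict outputs can also be computed directly with the Mann-Whitney pair-counting formula. A small numpy sketch (not from the thread) that can serve as a sanity check against the streaming approximation:

```python
import numpy as np

def exact_roc_auc(y_true, y_score):
    """Exact ROC AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs ranked correctly, counting ties as half.
    Equivalent to sklearn.metrics.roc_auc_score for binary labels."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score, dtype=float)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

If this value and the streaming metric disagree wildly, the streaming variables are probably never being reset or initialized, rather than the Riemann-sum approximation being that far off.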
@sherrylau I don't have much idea about this discrepancy. Maybe others can help.
Using metrics that aren't from tf.keras.metrics is in general not supported with tf.keras models, sorry. In the future, we'll make tf.metrics compatible with tf.keras, but that's not the case today.
The wrapping trick used by @BogdanRuzh should work, though. But be aware it's a hack.
Any news on Keras supporting tf.metrics?
@carlthome I had a recent comment on this topic at https://github.com/tensorflow/tensorflow/issues/20377#issuecomment-401933700. But a timeline was not disclosed.
When do we expect support for tf.metrics in Keras? Thanks!
Looks like in TensorFlow 2, tf.metrics is replaced by tf.keras.metrics.
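Right, and in TF 2 the stateful class can be passed straight to compile. A quick sketch (the example data here is illustrative, not from the thread):

```python
import tensorflow as tf

# Stateful AUC metric: update_state() accumulates confusion-matrix
# bins across batches, result() returns the approximate ROC AUC.
auc = tf.keras.metrics.AUC(num_thresholds=200, curve='ROC')
auc.update_state([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])
print(float(auc.result()))  # approximately 0.75

# Or directly in compile, with no wrapper needed:
# model.compile(loss='binary_crossentropy', optimizer='adam',
#               metrics=[tf.keras.metrics.AUC()])
```

Keras resets the metric's state between epochs automatically, so none of the reset-callback machinery from earlier in the thread is required.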