Keras crashes whenever the TensorBoard callback is used with histograms over multiple runs. The first run always succeeds, but the second run fails right after the first epoch (when the callbacks are invoked).
Code to reproduce (validation data must be passed for TensorBoard histograms to be computed):
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.callbacks import TensorBoard

data = np.random.random((1000, 784))
labels = np.random.randint(2, size=(1000, 1))
val_data = np.random.random((1000, 784))
val_labels = np.random.randint(2, size=(1000, 1))

for _ in range(3):
    model = Sequential()
    model.add(Dense(784, input_dim=784))
    model.add(Activation('relu'))
    model.add(Dense(784))
    model.add(Activation('relu'))
    model.add(Dense(784))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    model.compile(optimizer='sgd', loss='binary_crossentropy')
    model.fit(data, labels, validation_data=(val_data, val_labels),
              nb_epoch=10, batch_size=32,
              callbacks=[TensorBoard(histogram_freq=1)])
Error (many of these are printed):
W tensorflow/core/framework/op_kernel.cc:968] Invalid argument:
You must feed a value for placeholder tensor 'dense_input_1' with dtype float
[[Node: dense_input_1 = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/gpu:0"]()]]
Hi, I had the same problem.
A simple fix is to call K.clear_session() at the beginning of the loop (also see https://github.com/fchollet/keras/issues/2102).
I suspect this is caused by generating summaries with names that have previously been used. The documentation says:
When the Op is run, it reports an InvalidArgument error if multiple values in the summaries to merge use the same tag.
Hence, clearing everything avoids the issue (at least for me).
Edit: Add from keras import backend as K at the top.
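For reference, a minimal sketch of the reproduction loop with the workaround applied. It assumes Keras 2 with the TensorFlow backend; the network and data are shrunk for speed, and the per-run log_dir is an extra precaution I added so the runs' event files don't end up mixed in one directory:

```python
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.callbacks import TensorBoard

data = np.random.random((100, 784))
labels = np.random.randint(2, size=(100, 1))
val_data = np.random.random((100, 784))
val_labels = np.random.randint(2, size=(100, 1))

for run in range(3):
    # The fix: wipe the stale graph/session before building the next model,
    # so summary ops from the previous run cannot collide with the new ones.
    K.clear_session()
    model = Sequential()
    model.add(Dense(32, input_dim=784))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    model.compile(optimizer='sgd', loss='binary_crossentropy')
    model.fit(data, labels, validation_data=(val_data, val_labels),
              epochs=1, batch_size=32, verbose=0,
              callbacks=[TensorBoard(log_dir='./logs/run_%d' % run,
                                     histogram_freq=1)])
```

Without clear_session(), each loop iteration adds a fresh set of layers and summary ops to the same default graph, which is what eventually triggers the unfed-placeholder / duplicate-tag errors.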
This issue still exists as of 2.0.6
Any plans to fix?
@bicubic
We are now on Keras 2.1.5 and this problem still exists; it is not resolved by K.clear_session()
clear_session() is not working for me; this makes TensorBoard unusable...