Hi,
I am following the tutorial "Keras as a simplified interface to TensorFlow".
When I try to run this code:
import tensorflow as tf
from keras import backend as K
from keras.layers import Dense
from keras.objectives import categorical_crossentropy
from tensorflow.examples.tutorials.mnist import input_data
sess = tf.Session()
K.set_session(sess)
img = tf.placeholder(tf.float32, shape=(None, 784))
labels = tf.placeholder(tf.float32, shape=(None, 10))
x = Dense(128, activation='relu')(img)
x = Dense(128, activation='relu')(x)
preds = Dense(10, activation='softmax')(x)
loss = tf.reduce_mean(categorical_crossentropy(labels, preds))
mnist_data = input_data.read_data_sets('MNIST_data', one_hot=True)
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)
#sess.run(tf.global_variables_initializer())
with sess.as_default():
    for i in range(100):
        batch = mnist_data.train.next_batch(50)
        train_step.run(feed_dict={img: batch[0],
                                  labels: batch[1]})
I got the error:
tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value dense_1_W
If I uncomment sess.run(tf.global_variables_initializer()), it works.
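For reference, this is the ordering that works for me: the initializer goes right after building the train step, before the training loop.

train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

# Initialize all variables (including the weights the Keras Dense layers
# created) before running the first training step.
sess.run(tf.global_variables_initializer())

with sess.as_default():
    for i in range(100):
        batch = mnist_data.train.next_batch(50)
        train_step.run(feed_dict={img: batch[0],
                                  labels: batch[1]})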
So I think the error is caused by K.set_session(sess) not taking effect.
Is this a bug, or am I missing something?
Thank you!
Thanks, @makora9143 ! I was observing the same error and your suggestion resolves this issue for me.
Thank You it worked for me as well.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
@makora9143's suggestion resolved that issue, but now I am facing a problem with Keras model.predict(): it returns the same values for every input.
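For anyone hitting the same thing, a minimal sketch of the ordering that can produce constant predictions; model.h5 and the input shape are placeholders, not the actual code:

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.models import load_model

sess = tf.Session()
K.set_session(sess)
model = load_model('model.h5')  # placeholder path; loads trained weights into the graph
sess.run(tf.global_variables_initializer())  # overwrites the weights just loaded...
x_test = np.random.rand(4, 784).astype('float32')  # placeholder input; shape depends on the model
preds = model.predict(x_test)  # ...so predict() returns near-identical values for every input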
I faced a similar problem with constant predictions when I wanted to attach a Keras sub-network to an existing TensorFlow graph. The reason is that tf.global_variables_initializer() resets the weights of the loaded Keras network. I wanted to use its outputs as part of a loss. Here is the workaround I found:
tf.reset_default_graph()  # keep everything on one graph; may be redundant,
                          # but I had errors for tensors being on different graphs
with tf.Session(config=config) as sess:
    # load placeholders, tf variables, tf network, etc.
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, beta1=0.9)
    init = tf.global_variables_initializer()  # this reinitializes Keras weights,
                                              # so it must run before loading Keras
    sess.run(init)
    keras.backend.set_session(sess)
    kerr_net = LoadKerasNetwork()  # keras.models.load_model, or construct the model
                                   # and call load_weights()
    # For example, to feed a tensor through a trained Keras network:
    # model = Model(inputs=self.base_model.input, outputs=self.logits)
    # model.load_weights(model_weights_path)
    # model.trainable = False
    # output = model(input_tensor)  # this predicts on a tensor and returns a tensor!
    # Now use the Keras network's output as part of the loss:
    l1_loss = tf.losses.absolute_difference(network, target_image)
    kerr_loss = kerr_net(network)
    loss = l1_loss + 0.1 * kerr_loss
    train_op = optimizer.minimize(loss, global_step=global_step)
    # initialize only the optimizer's variables here, so the Keras weights
    # are not reset again
    sess.run(tf.variables_initializer(optimizer.variables()))
    # here you go
    loss_current, summary = sess.run([loss, images_summary],
                                     feed_dict={input_image: x, target_image: y})
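As an alternative to listing optimizer.variables() by hand, a sketch (assuming TF 1.x) that asks TensorFlow which variables are still uninitialized and initializes only those, leaving the loaded Keras weights untouched:

# report_uninitialized_variables() returns the names (without the ':0' suffix,
# as bytes under Python 3) of variables that have no value yet.
uninit_names = set(sess.run(tf.report_uninitialized_variables()))
uninit_vars = [v for v in tf.global_variables()
               if v.name.split(':')[0].encode() in uninit_names]
sess.run(tf.variables_initializer(uninit_vars))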
Thanks a lot for sharing!