Hey guys, I'm having a problem using Keras interchangeably with the TensorFlow API. Can you please point me to where I'm misunderstanding model construction?
# IMAGENET PRETRAINED VGG16
base_model = VGG16(weights='imagenet', include_top=True)
# KERAS LAYER DEPENDING ON PARAMS
vgg_out = base_model.get_layer(name='fc2') if include_fc else base_model.get_layer(name='block5_pool')
# GET FLATTENED TENSORFLOW TENSOR
flatten = Flatten()(vgg_out.output)
# DO WE WANT TO BACKPROP GRADS TO PRETRAINED LAYERS?
out = flatten if finetune_extractor else tf.stop_gradient(flatten, 'stop_grad')
# HASHING LAYERS
hash_layer = Dense(hash_size, activation='relu')(out)
# CREATE KERAS MODEL
return Model(inputs=base_model.input, outputs=hash_layer, name="hash_output")
The error I am getting is:
Traceback (most recent call last):
File "/home/*/*/*/main.py", line 24, in <module>
model = UTH(64, finetune_extractor=False, include_fc=False, input_shape=(None, 32, 32, 3), margin=1)
File "/home/*/*/*/UnsupervisedTripleHashing/model.py", line 11, in __init__
hash_net = self.inference(hash_size, finetune_extractor, include_fc)
File "/home/*/*/*/UnsupervisedTripleHashing/model.py", line 64, in inference
return Model(inputs=base_model.input, outputs=hash_layer, name="hash_output")
File "/home/kovacs/tf-venv3/lib/python3.5/site-packages/tensorflow/contrib/keras/python/keras/engine/topology.py", line 1643, in __init__
build_map_of_graph(x, seen_nodes, depth=0)
File "/home/kovacs/tf-venv3/lib/python3.5/site-packages/tensorflow/contrib/keras/python/keras/engine/topology.py", line 1634, in build_map_of_graph
next_node = layer.inbound_nodes[node_index]
AttributeError: 'NoneType' object has no attribute 'inbound_nodes'
Exception ignored in: <bound method BaseSession.__del__ of <tensorflow.python.client.session.Session object at 0x7f49d8526518>>
Traceback (most recent call last):
File "/home/kovacs/tf-venv3/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 587, in __del__
AttributeError: 'NoneType' object has no attribute 'TF_NewStatus'
Wrap tf.stop_gradient in a Lambda layer. The output of a raw tf.stop_gradient call is not a Keras tensor (it carries no Keras layer metadata), so Model() cannot trace the layer graph back through it, which is why it ends up with inbound_nodes of None.
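A minimal sketch of that fix, assuming the plain keras.layers import path (with tf.contrib.keras the corresponding Lambda class should work the same way):

import tensorflow as tf
from keras.layers import Lambda

# Wrapping tf.stop_gradient in a Lambda keeps the result a Keras tensor,
# so Model() can still trace the graph back to base_model.input while
# gradients are still blocked from reaching the pretrained layers.
out = flatten if finetune_extractor else Lambda(
    lambda t: tf.stop_gradient(t), name='stop_grad')(flatten)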
I fixed the "Exception ignored in:" error by adding the following:
import keras.backend as K
# get data, train model
model.fit_generator(...)
# clear the session manually
K.clear_session()
This fixed the error in my case. Apparently model.fit did not end the session cleanly.
Source for fix:
http://qiita.com/TomokIshii/items/178938b6db1edc16b94e
setup:
tensorflow - 1.1.0
keras - 2.0.4