Tfjs: KeyError when using tensorflowjs-converter and ssd_mobilenet_v2 from tf/models/object-detection

Created on 14 Mar 2019 · 4 comments · Source: tensorflow/tfjs

To get help from the community, we encourage using Stack Overflow and the tensorflow.js tag.

TensorFlow.js version

1.0.1

Browser version

We are using tfjs-node, but the error occurs in tensorflowjs_converter.

Describe the problem

When running the latest tfjs-converter on an ssd_mobilenet detection model from the object detection model zoo, I get a KeyError:

Traceback (most recent call last):
  File "/home/julian/tensorflowjs-latest/bin/tensorflowjs_converter", line 10, in <module>
    sys.exit(main())
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflowjs/converters/converter.py", line 358, in main
    strip_debug_ops=FLAGS.strip_debug_ops)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 268, in convert_tf_saved_model
    model = load(saved_model_dir, saved_model_tags)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py", line 329, in load
    root = load_v1_in_v2.load(export_dir, tags)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 159, in load
    return loader.load(tags=tags)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 148, in load
    signature_functions = self._extract_signatures(wrapped, meta_graph_def)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 107, in _extract_signatures
    for name, out in signature_def.outputs.items()})
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/eager/wrap_function.py", line 185, in prune
    [sink_tensor], pruned_graph, sources=flat_feeds + internal_captures)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/eager/lift_to_graph.py", line 315, in lift_to_graph
    _copy_non_source(op=op, graph=graph, op_map=op_map)
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/eager/lift_to_graph.py", line 152, in _copy_non_source
    copied_inputs = [op_map[x] for x in op.inputs]
  File "/home/julian/tensorflowjs-latest/lib/python3.6/site-packages/tensorflow/python/eager/lift_to_graph.py", line 152, in <listcomp>
    copied_inputs = [op_map[x] for x in op.inputs]
KeyError: <tf.Tensor ... dtype=float32>

There is no problem when using tensorflowjs==0.8.0, but then I can't use loadGraphModel in tfjs.

Code to reproduce the bug / link to feature request

  • install tensorflowjs==1.0.1 in a clean virtualenv with pip
  • download ssd_mobilenet_v2_coco_2018_03_29.tar.gz and extract it
  • cd into ssd_mobilenet_v2_coco_2018_03_29
  • run tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --saved_model_tags=serve ./saved_model ./web_model
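For reference, the goal is to consume the converted output from tfjs-node via the loadGraphModel API. A minimal sketch, assuming the ./web_model output directory from the command above (the file:// handler is provided by tfjs-node):

import * as tf from '@tensorflow/tfjs-node';

// Load the converted graph model from disk. The path assumes the
// ./web_model output directory produced by the converter run above.
async function loadModel(): Promise<tf.GraphModel> {
  return tf.loadGraphModel('file://./web_model/model.json');
}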

All 4 comments

@jvdgoltz It looks like the TF 2.0 alpha loader fails to load older saved models with control flow ops. While we are working on fixing this, you can still use the 0.8.0 converter with the --output_json=true flag, as documented in the README:
$ tensorflowjs_converter \
--input_format=tf_saved_model \
--output_node_names='MobilenetV1/Predictions/Reshape_1' \
--saved_model_tags=serve \
--output_json=true \
/mobilenet/saved_model \
/mobilenet/web_model

This outputs a JSON model file that works with the loadGraphModel API.
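Since SSD graphs contain control flow ops (the same ops that trip up the TF 2.0 alpha loader), inference on the loaded model must go through executeAsync rather than predict. A hedged sketch; the 300x300 int32 input is an assumption based on the usual ssd_mobilenet_v2 config, so verify it against your model's signature:

import * as tf from '@tensorflow/tfjs-node';

async function detect(): Promise<void> {
  const model = await tf.loadGraphModel('file://./web_model/model.json');
  // Assumed input: a batched 300x300 RGB image tensor; replace with a
  // real image and the shape your pipeline config specifies.
  const input = tf.zeros([1, 300, 300, 3], 'int32');
  // executeAsync is required because the graph contains control flow
  // ops; predict() throws for such models.
  const outputs = (await model.executeAsync(input)) as tf.Tensor[];
  outputs.forEach((t) => t.dispose());
  input.dispose();
}

detect();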

I also bumped into this issue. Is there an approximate timeline for a fix?

Any update on this?

This issue has been fixed, please try again.
