I'm trying to convert a frozen graph to a JSON file. I use this command:
tensorflowjs_converter --input_format=tf_frozen_model --output_node_names="SemanticPredictions" --saved_model_tags=serve frozen_inference_graph.pb mymodal
But it gives this error:
Traceback (most recent call last):
File "d:\programdata\anaconda3\envs\tensorflow0\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "d:\programdata\anaconda3\envs\tensorflow0\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "D:\ProgramData\Anaconda3\envs\tensorflow0\Scripts\tensorflowjs_converter.exe\__main__.py", line 7, in <module>
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflowjs\converters\converter.py", line 645, in pip_main
main([' '.join(sys.argv[1:])])
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflowjs\converters\converter.py", line 649, in main
convert(argv[0].split(' '))
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflowjs\converters\converter.py", line 632, in convert
strip_debug_ops=args.strip_debug_ops)
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflowjs\converters\tf_saved_model_conversion_v2.py", line 379, in convert_tf_frozen_model
strip_debug_ops=strip_debug_ops)
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflowjs\converters\tf_saved_model_conversion_v2.py", line 133, in optimize_graph
graph.add_to_collection('train_op', graph.get_operation_by_name(name))
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3633, in get_operation_by_name
return self.as_graph_element(name, allow_tensor=False, allow_operation=True)
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3505, in as_graph_element
return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
File "d:\programdata\anaconda3\envs\tensorflow0\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3565, in _as_graph_element_locked
"graph." % repr(name))
KeyError: "The name 'SemanticPredictions' refers to an Operation not in the graph."
I don't know why it gives this KeyError: "The name 'SemanticPredictions' refers to an Operation not in the graph."
The error means the output name is not correct; you need to correct the output names for the graph, thank you.
Thanks for your response.
And how can I get the output names? I have a large custom-trained frozen graph. I've tried Netron but couldn't decide which node I should pick.
Can you please try this solution and let us know if it works.
@sundowatch if you have a frozen graph, you can also try the summarize_graph tool from here
@rthadur That returns output names, but which one?
@dozeBoy Yes, I have a frozen graph. But is there any Python solution for this tool? Otherwise I need to switch to C++.
@sundowatch it's a CLI tool. You need to build it first with Bazel and then you can use it within a terminal.
@sundowatch I can't say exactly what you will need to do since I am not familiar with the model.
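If you'd rather stay in Python than build the Bazel tool, here is a rough heuristic sketch (my own illustration, not part of summarize_graph): output nodes are usually the ones whose result is never consumed by any other node. The helper name `likely_outputs` and the TensorFlow snippet in the comment are assumptions for illustration only.

```python
# Heuristic sketch: guess the output node(s) of a frozen graph by finding
# nodes that no other node consumes. Assumes you can enumerate
# (name, op_type, input_names) for each node in the GraphDef.

def likely_outputs(nodes):
    """Return names of nodes whose output is never used as an input.

    nodes: iterable of (name, op_type, input_names) tuples.
    Input names in a GraphDef may carry an output-index suffix like ':0'
    or a '^' control-dependency prefix, so both are stripped first.
    """
    consumed = set()
    for _name, _op, inputs in nodes:
        for inp in inputs:
            consumed.add(inp.lstrip('^').split(':')[0])
    return [name for name, op, _ in nodes
            if name not in consumed and op not in ('NoOp', 'Assert')]

# With TensorFlow installed, the tuples could be built roughly like this
# (untested sketch, TF 1.x-style API):
#
#   import tensorflow as tf
#   graph_def = tf.compat.v1.GraphDef()
#   with open('frozen_inference_graph.pb', 'rb') as f:
#       graph_def.ParseFromString(f.read())
#   nodes = [(n.name, n.op, list(n.input)) for n in graph_def.node]
#   print(likely_outputs(nodes))

# Tiny hand-built example: a -> b -> c, so only 'c' is a candidate output.
example = [
    ('a', 'Placeholder', []),
    ('b', 'Conv2D', ['a:0']),
    ('c', 'Softmax', ['b']),
]
print(likely_outputs(example))  # ['c']
```

For a segmentation model like DeepLab, the candidate left over is typically the prediction node (e.g. `SemanticPredictions`), but verify against Netron or summarize_graph before passing it to the converter.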
@rthadur I've trained my own model
Ok, I've converted the saved_model with this command:
tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --signature_name=serving_default --saved_model_tags=serve saved_model saved_model/custom_model
But now when I load it onto the page it gives an error like this:
Uncaught (in promise) Error: layer: Improper config format: {"node": ...........................
I don't know what I have done wrong.
Ok, my mistake. I was using loadLayersModel() instead of loadGraphModel().