Dear all,
Recently I was migrating to TensorFlow Object Detection API version 2.0 from 1.15.0, and I was able to train and evaluate the model.
But while creating a frozen model using export_inference_graph.py, I get the error below:
Traceback (most recent call last):
  File "export_inference_graph.py", line 206, in <module>
I fixed that first issue by disabling eager execution; now I get a new error:
File "export_inference_graph.py", line 206, in
tf.app.run()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/platform/app.py", line 40, in run
_run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 299, in run
_run_main(main, args)
File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 250, in _run_main
sys.exit(main(argv))
File "export_inference_graph.py", line 202, in main
side_input_types=side_input_types)
File "/usr/local/lib/python3.6/dist-packages/object_detection/exporter.py", line 625, in export_inference_graph
side_input_types=side_input_types)
File "/usr/local/lib/python3.6/dist-packages/object_detection/exporter.py", line 514, in _export_inference_graph
profile_inference_graph(tf.get_default_graph())
File "/usr/local/lib/python3.6/dist-packages/object_detection/exporter.py", line 642, in profile_inference_graph
contrib_tfprof.model_analyzer.TRAINABLE_VARS_PARAMS_STAT_OPTIONS)
NameError: name 'contrib_tfprof' is not defined
I did some research and came to know that the contrib module is not supported in TensorFlow 2.0.
I am stuck now: how can I save my trained object detection model with 2.0 and use it for prediction? Please help!
Did you manage to solve it? I'm stuck at the same point now!
Hi, can I ask which binary you are using? For TF2, model_main_tf2.py should be used. Thanks!
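For reference, a minimal training invocation with that binary looks roughly like this (the paths are placeholders for your own pipeline config and model directory, not values from this thread):
python model_main_tf2.py --pipeline_config_path=models/my_model/pipeline.config --model_dir=models/my_model --alsologtostderr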
The error is raised when you run the export_inference_graph.py file.
The issue is also present in export_tflite_ssd_graph.py / export_tflite_ssd_graph_lib.py.
Disabling eager execution in export_tflite_ssd_graph.py results in a similar error:
File "/home/mate/venv/tensorflow2.2/lib/python3.7/site-packages/object_detection/exporter.py", line 145, in rewrite_nn_resize_op
while remove_nn():
File "/home/mate/venv/tensorflow2.2/lib/python3.7/site-packages/object_detection/exporter.py", line 100, in remove_nn
input_pattern = graph_matcher.OpTypePattern(
NameError: name 'graph_matcher' is not defined
Placeholders and frozen graphs are TF1 features. In TF2 you should use SavedModel. See exporter_main_v2.py.
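For what it's worth, here is a minimal sketch of running inference from the SavedModel that exporter_main_v2.py writes; the path and image size are placeholders, not values from this thread:
import numpy as np
import tensorflow as tf

# Directory written by exporter_main_v2.py (hypothetical path).
saved_model_dir = "exported-models/my_model/saved_model"

# Load the exported detection model; the loaded object is callable.
detect_fn = tf.saved_model.load(saved_model_dir)

# With --input_type image_tensor the model expects a uint8 batch [1, H, W, 3].
image = np.zeros((1, 640, 640, 3), dtype=np.uint8)
detections = detect_fn(tf.constant(image))

# Outputs include 'detection_boxes', 'detection_scores', 'detection_classes'.
print(detections["detection_scores"].numpy()[0, :5])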
I would like to see this info on this page under guides.
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf2.md
I have the same issue with export_inference_graph.py. The model was trained using model_main_tf2.py.
When I disable eager execution, the new error is the same as the one described by the original author (NameError: name 'contrib_tfprof' is not defined).
I tried using exporter_main_v2.py, which raises an assertion error.
I also ran into an error after using exporter_main_v2.py. Please see the issue at the link below.
https://github.com/tensorflow/models/issues/8886#issuecomment-659921958
Use only the exporter_main_v2.py script for exporting the models. By default it exports the most recent checkpoint present in checkpoint_dir. If you want it to export a particular checkpoint, change model_checkpoint_path in the checkpoint file.
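For reference, the checkpoint file in that directory is a small text file that looks roughly like this (the checkpoint numbers are just illustrative); pointing model_checkpoint_path at an earlier entry makes the exporter pick that checkpoint:
model_checkpoint_path: "ckpt-12"
all_model_checkpoint_paths: "ckpt-11"
all_model_checkpoint_paths: "ckpt-12"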
I had the same error.
When we export the model, we need to specify the path of the checkpoint directory, not the path of the checkpoint itself, in trained_checkpoint_dir in TF2:
python .\exporter_main_v2.py --input_type image_tensor --pipeline_config_path .\models\my_efficientdet_d1\pipeline.config --trained_checkpoint_dir .\models\my_efficientdet_d1\ --output_directory .\exported-models\my_model
This still does not give me a model in output_directory. I am unable to find the reason why.
I came across the same error when I was using this file, but when I use exporter_main_v2.py it seems okay.
If only the exporter_main_v2.py script should be used with TF2, why are they not making an official statement on their webpage? Besides that, does it mean it won't be possible to get an inference frozen-graph .pb file with TensorFlow 2 anymore?
If so, how did they get the frozen graphs in the model zoo with TF2?
Did anyone find something on this issue? I'm also stuck at the same point with export_inference_graph.py.