TensorFlow installed from binary (CPU):
Hi,
I am trying to export the ssdlite_mobilenet_v2 model to TFLite.
I downloaded it from this path, and I tried two ways.
First way: I ran export_tflite_ssd_graph.py like this:
```shell
export CONFIG_FILE=/home/VICOMTECH/uelordi/projects/tflite_models/ssdlite_mobilenet_v2_coco_2018_05_09/pipeline.config
export CHECKPOINT_PATH=/home/VICOMTECH/uelordi/projects/tflite_models/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt
export OUTPUT_DIR=/home/VICOMTECH/uelordi/projects/tflite_models/ssdlite_mobilenet_v2_coco_2018_05_09/tflite
python /home/VICOMTECH/uelordi/SDK/tensorflow1/models/models/research/object_detection/export_tflite_ssd_graph.py \
--pipeline_config_path=$CONFIG_FILE \
--trained_checkpoint_prefix=$CHECKPOINT_PATH \
--output_directory=$OUTPUT_DIR \
--add_postprocessing_op=true
```
Then I converted the result with toco:

```shell
toco \
--input_file=tflite_graph.pb \
--output_file=latest_ssdlite_mobilenetv2.tflite \
--input_format=TENSORFLOW_GRAPHDEF \
--input_shapes=1,300,300,3 \
--output_format=TFLITE \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
--inference_type=FLOAT \
--mean_values=128 \
--std_values=128 \
--change_concat_input_ranges=false
```
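For context, `--mean_values=128` and `--std_values=128` encode the standard MobileNet preprocessing: each uint8 pixel in [0, 255] is mapped to roughly [-1, 1]. A minimal pure-Python sketch of the arithmetic the converter bakes in (no TensorFlow required):

```python
def normalize_pixel(value, mean=128.0, std=128.0):
    """Map a uint8 pixel value in [0, 255] to the float range the model expects."""
    return (value - mean) / std

print(normalize_pixel(0))    # -1.0
print(normalize_pixel(128))  # 0.0
print(normalize_pixel(255))  # 0.9921875
```

This is why a FLOAT-inference model still takes the mean/std flags: they describe how the uint8 input range relates to the float range the network was trained on.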
and I got this error:

```
Some of the operators in the model are not supported by the standard TensorFlow Lite runtime. If you have a custom implementation for them you can disable this error with --allow_custom_ops. Here is a list of operators for which you will need custom implementations: DIV, Squeeze, TFLite_Detection_PostProcess.
```
So I added `--allow_custom_ops`, and then used the TFLite interpreter on Android:
```java
d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
```
As I expected, I got a custom-op error:

```
Internal error: Cannot create interpreter: Didn't find custom op for name 'DIV' with version 1
Didn't find custom op for name 'Squeeze' with version 1
Registration failed.
```
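The toco error message itself lists exactly which ops would need custom kernels. A hypothetical little helper (the message format is copied from the error above; `missing_ops` is not part of any TensorFlow API) to pull those names out of a build log:

```python
import re

def missing_ops(toco_error):
    """Extract op names from toco's 'custom implementations' error message."""
    match = re.search(r"custom implementations: (.+?)\.?$", toco_error)
    if not match:
        return []
    return [op.strip() for op in match.group(1).split(",")]

error = ("Here is a list of operators for which you will need "
         "custom implementations: DIV, Squeeze, TFLite_Detection_PostProcess.")
print(missing_ops(error))  # ['DIV', 'Squeeze', 'TFLite_Detection_PostProcess']
```

Seeing `DIV` and `Squeeze` in that list usually indicates a converter/runtime version mismatch, since both are builtin TFLite ops in later releases; only `TFLite_Detection_PostProcess` is genuinely a custom op.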
Second way: I ran toco directly on the frozen inference graph:
```shell
toco \
--input_file=frozen_inference_graph.pb \
--output_file=ssd_lite_v2.tflite \
--input_format=TENSORFLOW_GRAPHDEF \
--input_shapes=1,300,300,3 \
--output_format=TFLITE \
--input_arrays=normalized_input_image_tensor \
--output_arrays='detection_boxes,detection_scores,detection_classes,num_detections' \
--inference_type=FLOAT \
--mean_values=128 \
--std_values=128 \
--change_concat_input_ranges=false
```
As with the first way, I got operator errors, so I added `--allow_custom_ops`; the TFLite interpreter on Android then fails with the same custom-op error:

```
Internal error: Cannot create interpreter: Didn't find custom op for name 'DIV' with version 1
Didn't find custom op for name 'Squeeze' with version 1
Registration failed.
```
So my questions are:
I have a similar problem!
```
2018-08-07 13:21:07.345473: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1053] Converting unsupported operation: SquaredDifference
2018-08-07 13:21:07.361241: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1053] Converting unsupported operation: TFLite_Detection_PostProcess
```
Same problem on Android. I tried the first variant, but got an error like "Cannot create interpreter: Didn't find custom op for name 'TFLite_Detection_PostProcess'". Then I noticed there were some notifications during toco conversion, like "Converting unsupported operation: TFLite_Detection_PostProcess", but the tflite file was created anyway. I then tried to generate a tflite file without any warnings, and I just can't. I also tried your second variant, but it fails with "Output array not found: detection_boxes". Well, at least it doesn't generate an invalid file.
P.S. I used the ssd_mobilenet_v1_coco pretrained model.
If the tflite file was created, it is valid. Does the file run in the app?
FYI, the "Converting unsupported operation" warning is there because the postprocessing op is a custom TFLite op (not a regular TensorFlow op).
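For reference, that custom `TFLite_Detection_PostProcess` op produces four output tensors, which is why the first toco invocation lists the `:1`, `:2`, `:3` suffixes. A small sketch of the index-to-meaning mapping as exported by export_tflite_ssd_graph.py (shapes shown are the usual ones; `num` is the maximum number of detections):

```python
# Output tensors of the TFLite_Detection_PostProcess custom op.
POSTPROCESS_OUTPUTS = {
    "TFLite_Detection_PostProcess":   "detection boxes, shape [1, num, 4]",
    "TFLite_Detection_PostProcess:1": "detection classes, shape [1, num]",
    "TFLite_Detection_PostProcess:2": "detection scores, shape [1, num]",
    "TFLite_Detection_PostProcess:3": "number of valid detections, shape [1]",
}

for name, meaning in POSTPROCESS_OUTPUTS.items():
    print(f"{name} -> {meaning}")
```

This also explains why the second variant fails with "Output array not found: detection_boxes": the exported TFLite graph no longer contains tensors named `detection_boxes` etc.; those names only exist in the regular TensorFlow frozen graph.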
@achowdhery the file does not run, due to a "Cannot create interpreter: Didn't find custom op for name 'TFLite_Detection_PostProcess'" exception.
@kate-kate I think this is a version issue: you don't have a version of TensorFlow in which the op is available. Please provide details on how you installed TensorFlow. Is this from within ML Kit?
@achowdhery I installed TensorFlow from source, just as described on the TensorFlow site.
Here is the tf_env_collect.sh result; maybe it will help:
```
== cat /etc/issue ===============================================
Darwin MacBook-Pro-Ekaterina.local 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64
Mac OS X 10.13.6

== are we in docker =============================================
No

== compiler =====================================================
Apple LLVM version 9.1.0 (clang-902.0.39.2)
Target: x86_64-apple-darwin17.7.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin

== uname -a =====================================================
Darwin MacBook-Pro-Ekaterina.local 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64

== check pips ===================================================
numpy 1.14.5
numpydoc 0.7.0
protobuf 3.6.0
tensorflow 1.10.0

== check for virtualenv =========================================
False

== tensorflow import ============================================
tf.VERSION = 1.10.0
tf.GIT_VERSION = v1.10.0-rc1-19-g656e7a2b34
tf.COMPILER_VERSION = v1.10.0-rc1-19-g656e7a2b34
Sanity check: array([1], dtype=int32)

== env ==========================================================
LD_LIBRARY_PATH is unset
DYLD_LIBRARY_PATH is unset

== nvidia-smi ===================================================
tf_env_collect.sh: line 105: nvidia-smi: command not found

== cuda libs ===================================================
```
I am having a similar issue here, but my tflite model does work in the Android demo; it just returns many false-positive detections. I am not sure whether this toco log line points at the problem:

```
tensorflow/lite/toco/import_tensorflow.cc:1332] Converting unsupported operation: TFLite_Detection_PostProcess
```
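Many apparent "false positives" from these SSD models are simply low-confidence detections; the official demo apps filter on score before drawing boxes. A minimal sketch of that filtering, assuming each detection is a plain dict (the structure here is illustrative, not a TFLite API):

```python
def filter_detections(detections, min_score=0.5):
    """Keep only detections whose confidence meets the threshold."""
    return [d for d in detections if d["score"] >= min_score]

detections = [
    {"class": "person", "score": 0.92, "box": (0.10, 0.20, 0.50, 0.60)},
    {"class": "dog",    "score": 0.12, "box": (0.30, 0.30, 0.40, 0.40)},
]
print(filter_detections(detections))  # keeps only the 'person' detection
```

If the raw output still looks wrong after thresholding, it is worth double-checking that the app applies the same input normalization the model was converted with.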
@achowdhery any update regarding this issue? I am facing the same problem when converting a custom-trained model to tflite.
I am facing the same problem; did anyone fix it?
I am having the same issues. Any updates?