Models: Converting SSD MobileNet v2 320x320 From Saved Model to TFLite - tensorflow.lite.python.convert.ConverterError: requires all operands and results to have compatible element types

Created on 1 Sep 2020 · 4 comments · Source: tensorflow/models

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • [x] I am using the latest TensorFlow Model Garden release and TensorFlow 2.
  • [x] I am reporting the issue to the correct repository. (Model Garden official or research directory)
  • [x] I checked to make sure that this issue has not already been filed.

1. The entire URL of the file you are using

The Python file below, which is based on https://www.tensorflow.org/lite/convert/python_api#converting_a_savedmodel_

2. Describe the bug

When I try to convert the SSD MobileNet v2 320x320 saved_model to a TFLite file, converter.convert() (see the Python file under the Steps to Reproduce section) fails with: ConverterError: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 196, in toco_convert_protos
    model_str = wrap_toco.wrapped_toco_convert(model_flags_str,
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/wrap_toco.py", line 32, in wrapped_toco_convert
    return _pywrap_toco_api.TocoConvert(
Exception: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types
<unknown>:0: note: loc("Func/StatefulPartitionedCall/input/_0"): see current operation: %1 = "tf.Identity"(%arg0) {device = ""} : (tensor<1x?x?x3x!tf.quint8>) -> tensor<1x?x?x3xui8>


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/Data/TFOD/tf2convertTest.py", line 7, in <module>
    tflite_quant_model = converter.convert()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 1076, in convert
    return super(TFLiteConverterV2, self).convert()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 899, in convert
    return super(TFLiteFrozenGraphConverterV2,
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 629, in convert
    result = _toco_convert_impl(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 569, in toco_convert_impl
    data = toco_convert_protos(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 202, in toco_convert_protos
    raise ConverterError(str(e))
tensorflow.lite.python.convert.ConverterError: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types
<unknown>:0: note: loc("Func/StatefulPartitionedCall/input/_0"): see current operation: %1 = "tf.Identity"(%arg0) {device = ""} : (tensor<1x?x?x3x!tf.quint8>) -> tensor<1x?x?x3xui8>

3. Steps to reproduce

I downloaded the SSD MobileNet v2 320x320 model from the TensorFlow 2 Object Detection API model zoo and wanted to test converting just the base model to a TFLite model. When I run the following Python script, it fails with the error that all operands and results must have compatible element types.

Here is the Python File:

import tensorflow as tf

# Load the SavedModel shipped with the TF2 Object Detection API model.
converter = tf.lite.TFLiteConverter.from_saved_model('ssd_mobilenet_v2_320x320_coco17_tpu-8/saved_model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.experimental_new_converter = True  # use the MLIR-based converter
# Fall back to TF select ops for anything TFLite builtins cannot express.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()
with tf.io.gfile.GFile('model.tflite', 'wb') as f:
  f.write(tflite_model)
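For reference, the workaround that was being tracked for TF2 detection models is to re-export the model into a TFLite-friendly SavedModel before converting. The sketch below assumes the export_tflite_graph_tf2.py script that ships with the TF2 Object Detection API and uses placeholder paths; flag names may differ across versions, so check the script's --help output:

```shell
# Re-export the detection checkpoint into a TFLite-compatible SavedModel
# (export_tflite_graph_tf2.py lives in models/research/object_detection).
python models/research/object_detection/export_tflite_graph_tf2.py \
  --pipeline_config_path ssd_mobilenet_v2_320x320_coco17_tpu-8/pipeline.config \
  --trained_checkpoint_dir ssd_mobilenet_v2_320x320_coco17_tpu-8/checkpoint \
  --output_directory tflite_export

# Then convert the re-exported SavedModel with the TFLite converter CLI.
tflite_convert \
  --saved_model_dir tflite_export/saved_model \
  --output_file model.tflite
```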

4. Expected behavior

The base model should convert to a TFLite model without error. I have also tested with a custom-trained model, and both give the same error when using this exact conversion script.

5. System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Tested on both Windows 10 and MacOS 10.15.6
  • Mobile device name if the issue happens on a mobile device: None
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 2.3.0
  • Python version: 3.8.5
  • Bazel version (if compiling from source): None
  • GCC/Compiler version (if compiling from source): None
  • CUDA/cuDNN version: None
  • GPU model and memory: None
Labels: research, bug

All 4 comments

Hi @sajjadaemmi

Thanks for the info. It looks like @srjoglekar246 is working on releasing a script to convert TF2 models to TFLite in the issue you mentioned. I will watch that thread so I can use the script soon too!

@mihir-chauhan

This issue is a duplicate of #9033. Can we close it here and track it in #9033? That will make it easier to follow. Please let us know. Thanks!

@ravikyram

Sure, we can close this issue. I just want to make sure that the script by @srjoglekar246 will fix this issue and will work on an Android phone without hitting the java.lang.IllegalArgumentException: ByteBuffer is not a valid flatbuffer model error.
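As a quick sanity check before deploying to Android, you can verify that the converted file is actually a TFLite FlatBuffer: the FlatBuffer file identifier "TFL3" sits at byte offset 4 in every valid .tflite file, and a file missing it is the kind of buffer that raises the "ByteBuffer is not a valid flatbuffer model" exception. This is a minimal sketch (the helper name is mine, not part of any API):

```python
def looks_like_tflite(buf: bytes) -> bool:
    """Return True if buf carries the TFLite FlatBuffer identifier "TFL3".

    A FlatBuffer stores its 4-byte file identifier at offset 4, right
    after the root-table offset; for TFLite models that identifier is
    "TFL3". This does not prove the model is fully well-formed, but it
    catches truncated or mis-written files cheaply.
    """
    return len(buf) >= 8 and buf[4:8] == b"TFL3"


if __name__ == "__main__":
    with open("model.tflite", "rb") as f:
        data = f.read()
    print("valid TFLite header:", looks_like_tflite(data))
```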

