The Python file below is based on https://www.tensorflow.org/lite/convert/python_api#converting_a_savedmodel_
When I try to convert the SSD MobileNet v2 320x320 saved_model to a TFLite file, the call to converter.convert() (see the Python file under the Steps to Reproduce section) fails with: ConverterError: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 196, in toco_convert_protos
    model_str = wrap_toco.wrapped_toco_convert(model_flags_str,
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/wrap_toco.py", line 32, in wrapped_toco_convert
    return _pywrap_toco_api.TocoConvert(
Exception: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types
<unknown>:0: note: loc("Func/StatefulPartitionedCall/input/_0"): see current operation: %1 = "tf.Identity"(%arg0) {device = ""} : (tensor<1x?x?x3x!tf.quint8>) -> tensor<1x?x?x3xui8>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/Data/TFOD/tf2convertTest.py", line 7, in <module>
    tflite_quant_model = converter.convert()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 1076, in convert
    return super(TFLiteConverterV2, self).convert()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 899, in convert
    return super(TFLiteFrozenGraphConverterV2,
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 629, in convert
    result = _toco_convert_impl(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 569, in toco_convert_impl
    data = toco_convert_protos(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 202, in toco_convert_protos
    raise ConverterError(str(e))
tensorflow.lite.python.convert.ConverterError: <unknown>:0: error: loc("Func/StatefulPartitionedCall/input/_0"): requires all operands and results to have compatible element types
<unknown>:0: note: loc("Func/StatefulPartitionedCall/input/_0"): see current operation: %1 = "tf.Identity"(%arg0) {device = ""} : (tensor<1x?x?x3x!tf.quint8>) -> tensor<1x?x?x3xui8>
I downloaded the SSD MobileNet v2 320x320 model from the TensorFlow 2 Object Detection API model zoo and wanted to test converting just the base model to a TFLite model, but when I run the following Python script it fails with the error that all operands and results are required to have compatible element types.
Here is the Python file:
import tensorflow as tf

# Load the SavedModel downloaded from the TF2 detection model zoo.
converter = tf.lite.TFLiteConverter.from_saved_model('ssd_mobilenet_v2_320x320_coco17_tpu-8/saved_model')
# Apply default optimizations (dynamic-range quantization of weights).
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# Use the new MLIR-based converter and allow falling back to select TF ops.
converter.experimental_new_converter = True
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
# Conversion fails here with the ConverterError shown above.
tflite_model = converter.convert()

with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)
The base model should convert to TFLite without error. I have also tested a custom-trained model, and both give the same error when converted with this exact script.
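For context, the converter error points at a dtype mismatch (tf.quint8 vs. uint8) on the model input. Below is a minimal diagnostic sketch (not a fix) that prints the SavedModel's serving signature to confirm the declared input dtype and shape; it assumes the same model path as the script above.

import tensorflow as tf

# Diagnostic only: load the SavedModel and print its serving signature,
# to see the declared input dtype/shape that the converter error
# (tensor<1x?x?x3x!tf.quint8> -> uint8) is complaining about.
model = tf.saved_model.load('ssd_mobilenet_v2_320x320_coco17_tpu-8/saved_model')
infer = model.signatures['serving_default']

_, input_specs = infer.structured_input_signature
for name, spec in input_specs.items():
    print('input :', name, spec)   # expected something like shape=(1, None, None, 3), dtype=uint8

for name, tensor in infer.structured_outputs.items():
    print('output:', name, tensor.shape, tensor.dtype)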
This is a duplicate of https://github.com/tensorflow/models/issues/9033
Hi @sajjadaemmi
Thanks for the info. It looks like @srjoglekar246 is working on releasing a script to convert TF2 models to TFLite in the issue you mentioned. I will watch that thread so I can use that script soon too!
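In the meantime, for anyone else hitting this, here is a rough sketch of the two-step flow I expect that script to enable. The export_tflite_graph_tf2.py name and its flags are my assumption based on the Object Detection API repository, not something confirmed in this thread, so treat it as a sketch only.

# Step 1 (assumed): export a TFLite-friendly SavedModel with the Object
# Detection API's export script, roughly:
#   python models/research/object_detection/export_tflite_graph_tf2.py \
#     --pipeline_config_path ssd_mobilenet_v2_320x320_coco17_tpu-8/pipeline.config \
#     --trained_checkpoint_dir ssd_mobilenet_v2_320x320_coco17_tpu-8/checkpoint \
#     --output_directory tflite_export
# Step 2: convert the exported SavedModel (tflite_export/saved_model)
# instead of the original detection SavedModel.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model('tflite_export/saved_model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()

with tf.io.gfile.GFile('model.tflite', 'wb') as f:
    f.write(tflite_model)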
@mihir-chauhan
This issue is a duplicate of #9033. Can we close the issue here and track it in #9033? That will make it easier to follow. Please let us know. Thanks!
@ravikyram
Sure, we can close this issue. I just want to make sure that the script by @srjoglekar246 will fix this issue and that the converted model will work on an Android phone without throwing the java.lang.IllegalArgumentException: ByteBuffer is not a valid flatbuffer model error.
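As a cheap sanity check before pushing the .tflite file to an Android device, I plan to load it with the Python interpreter first. This is a sketch only: it confirms the file parses as a valid flatbuffer and lists the I/O tensors, but it does not guarantee every op is supported on the phone.

import tensorflow as tf

# Load the converted file with the Python TFLite interpreter; if the
# flatbuffer were invalid, this raises here rather than as a
# java.lang.IllegalArgumentException on the device.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print('input :', detail['name'], detail['shape'], detail['dtype'])
for detail in interpreter.get_output_details():
    print('output:', detail['name'], detail['shape'], detail['dtype'])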