Not sure if this is a bug or not, but I am marking it as one for now.
After downloading the pre-trained CenterNet ResNet50 V1 FPN Keypoints model (http://download.tensorflow.org/models/object_detection/tf2/20200711/centernet_resnet50_v1_fpn_512x512_kpts_coco17_tpu-8.tar.gz), I tried to convert it to tflite but couldn't. Some research led me to a previously filed issue, https://github.com/tensorflow/tensorflow/issues/43495, which explained that I should retrain the model and use tf-nightly to convert it to tflite. After I got the tflite model, I tried loading it with the Interpreter interface for TensorFlow Lite models (tf.lite.Interpreter), but it kept failing with this error:
ValueError: Did not get operators or tensors in subgraph 1
import tensorflow as tf

# path_to_tflite_model is the path to the converted .tflite file
interpreter = tf.lite.Interpreter(model_path=path_to_tflite_model)
interpreter.allocate_tensors()
interpreter.invoke()
I was expecting the code posted above not to fail.
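For reference, the conversion itself was done roughly like this (a minimal sketch, not the exact script; saved_model_dir is a placeholder for the re-exported SavedModel directory):

import tensorflow as tf  # tf-nightly, as suggested in the linked issue

# Convert the re-exported SavedModel to a tflite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)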
I opened the tflite model in the Netron app and it displays fine, so the file itself is not corrupted.
I tried that model with the C++ API at current master and could reproduce it; I got this output when building it with FlatBufferModel::BuildFromFile:
INFO: Initialized TensorFlow Lite runtime.
ERROR: Did not get operators or tensors in subgraph 1.
The exporting script does not support CenterNet as of now, but we are looking into it.
Will update this bug when it lands.
@srjoglekar246 I have already exported the CenterNet model to tflite, but I am having trouble interpreting it.
I am getting: ValueError: Did not get operators or tensors in subgraph 1
The architecture doesn't convert to TFLite as expected, so your exported model is not runnable.
@srjoglekar246 Thanks a lot for your answer! Can I ask if you have any idea when the fix will be released?
We are working with the Research team to add support in our conversion tooling, and also train some new models that are mobile-friendly (smaller dimensions, MobileNet backbone instead of ResNet, etc). There was significant work involved in re-writing some parts of the model to make it convertible to TFLite.
Should be landing in a month or two. Sorry about the delay!
I am getting the same bug when I try to convert my own machine translation model to TFLite using TensorFlow 2.3.1:
Internal error: Cannot create interpreter: Did not get operators or tensors in subgraph 3.
Is there any way to find out which operation in the model is causing this so I can avoid using it?
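In case it helps others, here is one way I found to dump what each subgraph actually contains (a sketch assuming the community tflite flatbuffer bindings, installed with pip install tflite; "model.tflite" is a placeholder). A subgraph listed with zero operators or tensors should be the one the interpreter is rejecting:

# Walk a .tflite flatbuffer and list each subgraph's operators.
import tflite  # community flatbuffer bindings for the TFLite schema

with open("model.tflite", "rb") as f:
    buf = f.read()

model = tflite.Model.GetRootAsModel(buf, 0)
# Reverse-map builtin opcode numbers to readable names.
opcode_names = {v: k for k, v in vars(tflite.BuiltinOperator).items()
                if isinstance(v, int)}

for s in range(model.SubgraphsLength()):
    subgraph = model.Subgraphs(s)
    print(f"subgraph {s}: {subgraph.OperatorsLength()} ops, "
          f"{subgraph.TensorsLength()} tensors")
    for j in range(subgraph.OperatorsLength()):
        op = subgraph.Operators(j)
        code = model.OperatorCodes(op.OpcodeIndex()).BuiltinCode()
        print("  ", opcode_names.get(code, f"UNKNOWN({code})"))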
If I instead convert the model using tf-nightly 2.5.0-dev20201111 and use tensorflow-lite:0.0.0-nightly in Android Studio to run it, then I get this different error:
Cannot create interpreter: Op builtin_code out of range: 127. Are you using old TFLite binary with newer model?
Registration failed.
Is there anything I can try doing to fix it?
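For what it's worth, the "builtin_code out of range" message means the runtime binary is older than the model, so the nightly runtime on Android needs to match the nightly converter. Before deploying, I now sanity-check the converted model with the Python interpreter from the same tf-nightly build that did the conversion (a minimal sketch; "model.tflite" and the zero-filled input are placeholders):

import numpy as np
import tensorflow as tf  # same tf-nightly build used for conversion

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Feed dummy data matching the declared input shape and dtype.
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"],
                       np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_output_details()[0])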