Models: [Feature request] Example of TensorFlow Lite C++ and MobileNet SSD for Object Detection

Created on 17 May 2018 · 16 Comments · Source: tensorflow/models

System information

  • What is the top-level directory of the model you are using: Master
  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): N/A
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): N/A
  • TensorFlow installed from (source or binary): N/A
  • TensorFlow version (use command below): N/A
  • Bazel version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: N/A

There is already an example for Java, https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/examples/android/src/org/tensorflow/demo/TFLiteObjectDetectionAPIModel.java .

However, it is not easy to adapt it to C++ without documentation, especially parsing the outputs.

.......
(fill inputs)
.......

interpreter->Invoke();
const std::vector<int>& results = interpreter->outputs();
TfLiteTensor* outputLocations = interpreter->tensor(results[0]);
TfLiteTensor* outputClasses   = interpreter->tensor(results[1]);
float* data = tflite::GetTensorData<float>(outputClasses);
for (int i = 0; i < NUM_RESULTS; i++)
{
    for (int j = 1; j < NUM_CLASSES; j++)
    {
        float score = expit(data[i * NUM_CLASSES + j]); // How does this have to be done?
    }
}

All 16 comments

@JaviBonilla any new findings?

@davidfant, not really. I switched to TensorFlow from TensorFlow Lite, because as @YijinLiu mentioned, the SSD TensorFlow Lite example generates too much noise and loses some good detections.

But, you can test it, just check this repository (https://github.com/YijinLiu/tf-cpu). In particular, you can find how to get the outputs in the _tf-cpu/benchmark/obj_detect_lite.cc_ file, _AnnotateMat()_ function, which is executed after the _Interpreter->Invoke()._

The only missing part is to get the coordinates of the bounding boxes, but you can calculate them as follows:

1.- Load the box prior file given in the example.
2.- Implement _decodeCenterSizeBoxes_ in C++ from the Java version given in https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/examples/android/src/org/tensorflow/demo/TFLiteObjectDetectionAPIModel.java, where the _predictions_ argument is the _output_locations_ variable.

Let me know if that helps.

Thanks @achowdhery, great post! Are the results obtained from _TensorFlow Lite_ the same as those obtained from _TensorFlow_? I ask because many layers were removed from the _pb_ graph when it was converted to _tflite_ format in a previous TensorFlow Lite example for object detection, and that made the results noticeably worse than _TensorFlow_'s.

@JaviBonilla The accuracy of quantized models is within 1% of the float models. No layers are removed in the model. What are the symptoms of results not being as good?

@achowdhery I did not mean there is an issue with quantized models, but I experienced a huge difference between the results given by _TensorFlow_ and _TensorFlow Lite_ when using SSD MobileNet models. This was discussed here by @YijinLiu.

This happened the last time I checked; perhaps the latest version has already solved the problem. I can prepare the same example for both and compare the results to show the discrepancies.

Invoke() takes a lot of time. Is there a way to optimize it for video stream detection?


Thanks for your reference! It's awesome!

@achowdhery Why did you close this? This issue is about having a C++ API example for doing inference with TensorFlow. Nowhere in the Medium post you linked is that discussed (most of it is about training, which is completely unrelated).

@RuABraun I don't know if there are simpler examples in the TensorFlow Lite repository, but I wrote some tutorials about apps using the TensorFlow Lite C++ API for object detection (MobileNet SSD).

Those examples are open source and hosted on GitHub. They use Qt/QML for the GUI, and Felgo is used to easily deploy the Qt apps to mobile devices.

There is also another example by @YijinLiu using OpenCV at https://github.com/YijinLiu/tf-cpu/blob/master/benchmark/obj_detect_lite.cc

I hope they help.

@JaviBonilla

Hi Javi,

Actually, I'm trying to run object detection on a Qualcomm board. For that I need a C++ wrapper with which I can read the tflite model and run inference. The @YijinLiu repository you linked looks good but doesn't have any documentation, such as where to start, how to run the code, environment setup, and so on.

Can you please help me with those so that I can try my trained .tflite model? It would be really helpful.

Hi there, I built a MobileNet SSD tflite C++ demo that runs on x86 and arm64 Linux (Ubuntu); here is the repo: https://github.com/finnickniu/tensorflow_object_detection_tflite.
Cheers.


Hi @shauryad15,

Sorry, I missed your message. You're right, there isn't any documentation, but you can begin with the main() function in obj_detect_lite.cc. You can also have a look at the Makefile to see how the software is compiled.

I think you could download the repository, study it a little, build it, and then try to adapt the code to your own model.
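For anyone starting from scratch rather than from that repository, the flow in such a main() boils down to the standard TensorFlow Lite C++ setup. A rough sketch under stated assumptions: the header paths match current TensorFlow releases (in the 2018 tree they lived under tensorflow/contrib/lite/ instead of tensorflow/lite/), and the model filename detect.tflite is a placeholder:

```cpp
#include <memory>
#include <vector>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // 1. Load the flatbuffer model from disk.
  auto model = tflite::FlatBufferModel::BuildFromFile("detect.tflite");
  if (!model) return 1;

  // 2. Build an interpreter with the built-in op resolver.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  // 3. Allocate tensors, fill the input image, and run inference.
  if (interpreter->AllocateTensors() != kTfLiteOk) return 1;
  // ... copy preprocessed pixels into
  //     interpreter->typed_input_tensor<float>(0) ...
  interpreter->Invoke();

  // 4. Read the SSD outputs (locations, classes, scores, count) and
  //    decode them as discussed earlier in this thread.
  const std::vector<int>& outputs = interpreter->outputs();
  TfLiteTensor* locations = interpreter->tensor(outputs[0]);
  (void)locations;
  return 0;
}
```

This has to be linked against the TensorFlow Lite library, which is why studying the repository's Makefile is good advice.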

@shauryad15,

Also, you can have a look at the @finnickniu repository; there is documentation about the building process.

If anyone is interested in object detection on iOS and Android using C++, I created this demo and these blog posts.

@ValYouW Thanks, Yuval. It would also be great if you explained the conversion process; I've seen many people get stuck there, including me sometimes. :)
