Mask_RCNN: How to keep the model loaded while running detection in a web service?

Created on 27 May 2018  ·  16 comments  ·  Source: matterport/Mask_RCNN

I know this might be a newbie question,
but is there any way to keep the model loaded while running detection again and again?

For example, there is an API which feeds images into the detection.
The detection function runs in about 200 ms, but the config and model loading take about 2 minutes.
So we want to keep the config and model loaded, which would save a lot of time on every subsequent image detection.

Any help would be greatly appreciated

Most helpful comment

@zungam I found the answer about an hour ago, but thanks for the quick reply! :)
@mtcld check out my answer here for the fix (https://github.com/matterport/Mask_RCNN/issues/588). In short, you have to call model._make_predict_function()
before you call
model.detect([image], verbose=1)

All 16 comments

model = modellib.MaskRCNN(mode="inference", config=<your_config_object>,
                          model_dir=<your_log_dir>)
model.load_weights(<your_weight_path>, by_name=True)

# The model is now loaded and kept in RAM.
# Do something with it:
for img in images:
    output = model.detect([img], verbose=0)[0]

I have a similar setup to what @zungam wrote, but it causes me to get
ValueError: Tensor Tensor("mrcnn_detection/Reshape_1:0", shape=(1, 100, 6), dtype=float32) is not an element of this graph.
Ever seen that?

Can you give the full error stack?

I do have the same problem

results = self.model.detect([image], verbose=1)
  File "/home/dev02/projects/property_expert_demo/detection/mrcnn/model.py", line 2435, in detect
    self.keras_model.predict([molded_images, image_metas, anchors], verbose=0)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/keras/engine/training.py", line 1832, in predict
    self._make_predict_function()
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/keras/engine/training.py", line 1029, in _make_predict_function
    **kwargs)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 2502, in function
    return Function(inputs, outputs, updates=updates, **kwargs)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 2445, in __init__
    with tf.control_dependencies(self.outputs):
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 4863, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 4481, in control_dependencies
    c = self.as_graph_element(c)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3478, in as_graph_element
    return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
  File "/home/dev02/anaconda3/envs/mask_rcnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3557, in _as_graph_element_locked
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("mrcnn_detection/Reshape_1:0", shape=(1, 100, 6), dtype=float32) is not an element of this graph.

I think the weights are deleted and the service is trying to call them back.

@zungam I found the answer about an hour ago, but thanks for the quick reply! :)
@mtcld check out my answer here for the fix (https://github.com/matterport/Mask_RCNN/issues/588). In short, you have to call model._make_predict_function()
before you call
model.detect([image], verbose=1)

Thanks @zungam and @ItsTehStory.
Sorry, English is not my first language; I think I just misled you about my question.

Let me restate my question like this:
imagine we have a.py to load the model and weights, and b.py just for running the detection on images fed by the API.
I know the detection can be run with a for-loop within one .py file,
but I just can't figure out how to keep a.py running while b.py is being called by the API repeatedly.

Maybe I should try a framework such as Flask,
but I just want to know if there is some way to do it from scratch.

Edited: @ItsTehStory thanks for the #588 link,
it worked for me.

I just added model.keras_model._make_predict_function() before the model.detect call, and with multiprocessing managers I can call model.detect in the client while loading the model and weights on the server side.

And I just tried the implementation with Flask to create a REST API, also adding _make_predict_function before calling model.detect; it works like a charm.

Thanks again @ItsTehStory
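The "from scratch, without a framework" setup asked about above can be sketched with just the Python standard library: load the model once at module import, then answer repeated detection requests from an HTTP handler that reuses it. In this sketch, DummyModel is a hypothetical stand-in for the real MaskRCNN object, which would be created and warmed up in the same place (see the comment):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# In a real service this would be the expensive part, done exactly once:
#   model = modellib.MaskRCNN(mode="inference", config=..., model_dir=...)
#   model.load_weights(..., by_name=True)
#   model.keras_model._make_predict_function()  # build the predict fn up front
class DummyModel:
    def detect(self, images, verbose=0):
        # Stand-in for the real detection; returns one result dict per image.
        return [{"class_ids": [1], "scores": [0.9]} for _ in images]

MODEL = DummyModel()  # loaded once, kept in RAM for the life of the process

class DetectHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # Reuse the already-loaded model on every request.
        result = MODEL.detect([payload.get("image")], verbose=0)[0]
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def serve(port=0):
    """Start the detection server on a background thread; port=0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), DetectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Call serve() once at startup; every request after that pays only the per-image detection cost, never the model-loading cost.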

Great! Happy you got it working!

@chohaku84 can you share how you converted this to a Flask app? Thanks

@siontist here is a simple implementation. Hope it will be of some help.

https://github.com/chohaku84/np_detect/blob/master/api/flask_app.py
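The linked file is a full implementation; stripped to its essence, the pattern is "load at import time, detect in the route". A hypothetical minimal version might look like the following, with DummyModel standing in for the real MaskRCNN model (which would be loaded and warmed up with _make_predict_function() in the same place):

```python
from flask import Flask, jsonify, request

# One-time, expensive setup at import time. With the real model:
#   model = modellib.MaskRCNN(mode="inference", config=..., model_dir=...)
#   model.load_weights(COCO_MODEL_PATH, by_name=True)
#   model.keras_model._make_predict_function()
class DummyModel:
    def detect(self, images, verbose=0):
        # Stand-in for real detection; one result dict per image.
        return [{"rois": [], "scores": [0.9]} for _ in images]

model = DummyModel()
app = Flask(__name__)

@app.route("/detect", methods=["POST"])
def detect():
    # Flask serves each request in a worker thread; the model object (and,
    # for Keras, its pre-built predict function) is shared across all of them.
    image = request.get_json(force=True).get("image")
    return jsonify(model.detect([image], verbose=0)[0])
```

With the real model in place, `flask run` (or any WSGI server) keeps the process, and therefore the loaded weights, alive between requests.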

Could you help me with how to keep a Keras model loaded in Django, please?

@luisito93
I've never used Django, but I believe the concept is similar in Django and in Flask:

  • You'd have to have Django route your requests to your liking (let's say /api for this)
  • You'd have to write a myModelStuff.py
  • Write a function that runs the detect/training (although I only tried it with detect)

When you have those, just move the loading of the model out of the function mentioned earlier, so that your model is loaded at import time (I am not a Python dev, sorry to say stuff like this ^^').
So in yourDjangoServer.py:

import myModelStuff
# Couldn't find how Django routes with just a quick search; I'll use Flask as an example
@app.route('/api')
def youNameThisWhatYouWant():
    myModelStuff.functionToRunDetection()
    return "your answer here"

I hope this can help you, but I also hope you found your solution during the last 5 days.

Thanks @ItsTehStory

Using model.keras_model._make_predict_function() before calling model.detect() worked for me.

In case you have developed a class and want to initialize the model in the constructor, _make_predict_function() needs to be called in the constructor. Calling that function elsewhere (in another function or in the Flask app) will not work.

As mentioned by @Jargon4072, @chohaku84 and @ItsTehStory:

...
    def __init__(self, image_byte=None, image_type=None):
        self.config = coco.CocoConfig()
        self.model = modellib.MaskRCNN(mode="inference", model_dir="./", config=self.config)
        self.model.load_weights(COCO_MODEL_PATH, by_name=True)
        self.model.keras_model._make_predict_function()
...
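The point about the constructor can be illustrated without Keras at all: everything expensive, including building the predict function, must happen once, in the thread that constructs the object, before worker threads (such as Flask request handlers) start calling detect. A sketch under that assumption, with the Keras calls replaced by a cheap stand-in:

```python
import threading
import time

class Detector:
    """Load once in the constructor, detect many times from any thread."""

    def __init__(self):
        # Expensive, one-time setup. With Mask R-CNN this is where you would do:
        #   self.model = modellib.MaskRCNN(mode="inference", ...)
        #   self.model.load_weights(COCO_MODEL_PATH, by_name=True)
        #   self.model.keras_model._make_predict_function()  # warm up HERE
        time.sleep(0.1)  # stands in for the multi-minute load
        self._predict = lambda image: {"scores": [0.9]}  # pre-built predict fn

    def detect(self, image):
        # Cheap per-request path; safe to call from worker threads because
        # everything was built in the constructor.
        return self._predict(image)
```

Constructing Detector once at startup and sharing it means every request thread hits only the cheap detect path.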

@satwikk Thanks for sharing; it solved my problem.
