Following https://github.com/facebookresearch/maskrcnn-benchmark/issues/27, I am opening a new issue since it's not directly related.
It would be nice to be able to export the model to ONNX for later reuse with different frameworks and/or languages.
I think it might be difficult to have our detection models be ONNX-exportable in the near future.
The reason being that they have a few custom operators, and those operators are not in the ONNX standard.
Adding them to the ONNX standard could be possible, but I believe this requires some time and discussion between different teams. Also, having other frameworks implement those operators is yet another story.
It looks like some people are already working on it:
That's good to know. Once ONNX supports the required ops, we can see what can be done on our side.
After this PR, onnx will probably have everything we need https://github.com/onnx/onnx/pull/1010
Hi @hadim,
Have you exported the model to ONNX?
Can you give me some tips on how to do it?
Thanks
https://github.com/facebookresearch/maskrcnn-benchmark/pull/138 needs to be fixed first.
Can the models be exported now?
The ONNX models repo contains a pre-trained Faster R-CNN model from maskrcnn-benchmark. However, it does not seem to have been exported directly.
Have you converted successfully?