Hi, could you help with exporting/converting at least one model into the ONNX format?
https://onnx.ai
Kind regards
Hi @WilliamTambellini, have you tried to follow the standard ONNX procedure for converting a PyTorch model?
The models in this repo are just regular PyTorch models.
Hello Thomas, I've not yet tried; I've just seen:
https://github.com/onnx/models/issues/130
https://stackoverflow.com/questions/54220042/how-do-you-generate-an-onnx-representation-of-a-pytorch-bert-pretrained-neural-n
Will try, thanks.
Hi, when I try to export a TokenClassification model to ONNX, I encounter RuntimeError: ONNX export failed: Couldn't export operator aten::erf. Does that mean some of the BERT model's layers are not supported by ONNX?
I think the problem comes from the definition of the GELU function, which is x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0))). Should I compute this function another way, or wait for ONNX to support this operator?
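One workaround (a sketch, not the repo's own code) is the well-known tanh approximation of GELU, which avoids erf entirely; the names below are illustrative, and the 0.044715 constant comes from the standard published approximation. A pure-Python comparison of the two forms:

```python
import math

def gelu_erf(x):
    # Exact GELU, matching the formula quoted above:
    # x * 0.5 * (1 + erf(x / sqrt(2)))
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation of GELU -- no erf operator needed.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

# The two agree to roughly 1e-3 over typical activation ranges.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(gelu_erf(x) - gelu_tanh(x)) < 1e-3
```

That said, as noted below, recent PyTorch versions export aten::erf to ONNX directly, so upgrading is usually the simpler fix.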
@geekboood Update your PyTorch version to the latest and the problem will most likely go away.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
For anyone who is looking for the answer:
torch=1.1.0
python=3.6
torch.onnx.export(model, (input_ids, segment_ids, input_mask), "bert.onnx", verbose=False)
works well for me
Hi, thanks for the answer. Do you get good results when using the exported model for inference in another framework? I exported a BertForQuestionAnswering model to ONNX without errors, but I'm getting wrong predictions when using onnxruntime or a second export to TF Serving and I can't figure out why!
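When the exported model runs but gives wrong predictions, a quick diagnostic is to feed the same inputs to the PyTorch model and to onnxruntime and compare the outputs within a tolerance. Below is a minimal, framework-free sketch of such a check (the helper names and the toy numbers are my own, for illustration): small floating-point drift is expected, but large differences usually point to a structural problem such as swapped inputs.

```python
def max_abs_diff(a, b):
    # Elementwise max absolute difference between two flat lists of floats.
    assert len(a) == len(b)
    return max(abs(x - y) for x, y in zip(a, b))

def outputs_match(torch_out, onnx_out, atol=1e-4):
    # True if the two output vectors agree within atol.
    return max_abs_diff(torch_out, onnx_out) <= atol

# Toy example: fp32 rounding noise is fine, permuted values are not.
ref = [0.12, -1.30, 2.05]
ok = [0.12001, -1.29998, 2.05002]   # typical fp32 drift -> matches
bad = [-1.30, 0.12, 2.05]           # symptom of swapped inputs -> fails

print(outputs_match(ref, ok))   # True
print(outputs_match(ref, bad))  # False
```

As the next comment explains, a mismatch like this can come from passing the input tensors in the wrong order.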
Not sure if this is still an issue for you, but in the BertForSequenceClassification model the parameters are in a different order:
torch.onnx.export(model, (input_ids, input_mask, segment_ids), "bert.onnx", verbose=False)
works as intended
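The underlying point is that torch.onnx.export passes the args tuple to the model's forward() positionally, so the tuple order must match the signature. A pure-Python illustration (the forward() below is a hypothetical stand-in, not the actual transformers signature):

```python
# Hypothetical stand-in for a model's forward(); real BERT classes differ
# in whether attention_mask or token_type_ids comes second.
def forward(input_ids, attention_mask, token_type_ids):
    return {"input_ids": input_ids,
            "attention_mask": attention_mask,
            "token_type_ids": token_type_ids}

input_ids = [101, 2054, 102]
segment_ids = [0, 0, 0]
input_mask = [1, 1, 1]

# Matching order routes each tensor to the right parameter...
good = forward(*(input_ids, input_mask, segment_ids))
# ...while a swapped order silently misroutes them -- no error is raised,
# the model just sees the wrong tensors and predicts garbage.
bad = forward(*(input_ids, segment_ids, input_mask))

print(good["attention_mask"])  # [1, 1, 1]
print(bad["attention_mask"])   # [0, 0, 0] -- segment_ids landed here
```

This is why the export "succeeds" either way but only one argument order gives correct predictions downstream.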
@chessgecko Wow, you're right, thanks! Working now.
cc @mfuntowicz :)