how to export to onnx? #71
I have converted the model to ONNX successfully:
When I export the .pt to ONNX, I get this error. Can you tell me how you converted the model to ONNX successfully?
@xddlj try opset_version = 13
How should the function arguments be written? For the official .pt, I don't know what input/output shapes and names to use for the conversion.
Are you converting the official kapao_s_coco.pt? Following your code, I get this error:
Sorry, I misunderstood you. The model needs to be loaded with the torch framework first, right?
It would be best if the maintainers provided an export.py script.
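On loading the model with torch first: YOLOv5-style checkpoints store the `nn.Module` itself inside a dict under a `"model"` key, and as far as I can tell kapao_s_coco.pt follows that convention (treat the key name as an assumption). A sketch, simulating that checkpoint layout with a tiny stand-in module so the input/output shapes can be discovered before export:

```python
import torch
import torch.nn as nn

# Simulate a YOLOv5-style checkpoint: the whole module pickled under "model".
tiny = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1))
torch.save({"model": tiny}, "ckpt.pt")

# Load the checkpoint and pull the module out; weights_only=False is needed
# on newer torch versions because the file contains a pickled nn.Module.
ckpt = torch.load("ckpt.pt", map_location="cpu", weights_only=False)
model = ckpt["model"].float().eval()

# A dummy forward pass reveals the output shape to use when picking
# export names and dynamic axes.
with torch.no_grad():
    out = model(torch.zeros(1, 3, 64, 64))
print(tuple(out.shape))  # (1, 8, 64, 64)
```

With the real checkpoint you would replace the `torch.save` step by downloading kapao_s_coco.pt and loading it directly.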
I converted the model to ONNX with the following options:
Conversion seems to be successful, but when I load the model for inference using onnxruntime I get an error:
Error:
Was anyone able to do inference using ONNX Runtime?
I went through your dependencies to find where exactly you call "onnxruntime.InferenceSession(model_path)", but I could not find the code for ModelBase: 'from ...model_base import ModelBase'
I found some changes that were made in the yolov5 GitHub repo to handle this issue; I guess that is the cause of the problem during inference.