
how to export to onnx? #71

Open
xinsuinizhuan opened this issue Jul 28, 2022 · 15 comments

Comments

@xinsuinizhuan

No description provided.

@ykk648

ykk648 commented Jul 29, 2022

I have converted the model to ONNX successfully:
torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

@xddlj

xddlj commented Aug 24, 2022

> I have converted the model to onnx success: torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

When I export the .pt to ONNX, I get this error. Can you tell me how you converted the model to ONNX successfully?
RuntimeError: Exporting the operator silu to ONNX opset version 11 is not supported. Please open a bug to request ONNX export support for the missing operator.

@ykk648

ykk648 commented Aug 25, 2022

@xddlj try opset_version = 13
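If bumping the opset is not possible (e.g. an older runtime pins you to opset 11), another workaround, the same idea yolov5's export code used, is to swap `nn.SiLU` modules for an export-friendly `x * sigmoid(x)` before exporting. A sketch, where `ExportSiLU` and `patch_silu` are illustrative names, not from the repo:

```python
import torch
import torch.nn as nn

class ExportSiLU(nn.Module):
    """SiLU expressed with ops that older opsets can export."""
    def forward(self, x):
        return x * torch.sigmoid(x)

def patch_silu(module):
    # Recursively replace every nn.SiLU child in place.
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, ExportSiLU())
        else:
            patch_silu(child)
    return module

# Toy stand-in for the KAPAO model:
model = patch_silu(nn.Sequential(nn.Conv2d(3, 8, 3), nn.SiLU()))
```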

@PaulX1029

> I have converted the model to onnx success: torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

How should the parameters in this function be written? For the official .pt, I don't know what input/output shapes and names to use for the conversion.

@ykk648

ykk648 commented Nov 11, 2022

def torch2onnx(model_, input_, output_name="./test.onnx"):
    input_names = ["input_1"]
    output_names = ["output_1"]
    opset_version = 13
    dynamic_axes = None
    # dynamic_axes = {'input_1': [0, 2, 3], 'output_1': [0, 1]}
    torch.onnx.export(model_, input_, output_name, verbose=True, opset_version=opset_version,
                      input_names=input_names, output_names=output_names,
                      dynamic_axes=dynamic_axes, do_constant_folding=True)
    print('convert done!')  # raising a string is a TypeError in Python 3

@PaulX1029

@PaulX1029

Is this converting the official kapao_s_coco.pt? Following your code, I get this error:
Traceback (most recent call last):
  File "/mnt/sda/AI/kapao-master/export_xzw.py", line 18, in <module>
    torch2onnx(model_path, img, output_name)
  File "/mnt/sda/AI/kapao-master/export_xzw.py", line 11, in torch2onnx
    dynamic_axes=dynamic_axes, do_constant_folding=True)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/__init__.py", line 276, in export
    custom_opsets, enable_onnx_checker, use_external_data_format)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 94, in export
    use_external_data_format=use_external_data_format)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 676, in _export
    with select_model_mode_for_export(model, training):
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 38, in select_model_mode_for_export
    is_originally_training = model.training
AttributeError: 'str' object has no attribute 'training'

@PaulX1029

[screenshot]
@ykk648

@PaulX1029

Sorry, I misunderstood you. The model needs to be loaded with the torch framework first, right?

@ykk648

ykk648 commented Nov 14, 2022

@PaulX1029

@xinsuinizhuan
Author

xinsuinizhuan commented Nov 14, 2022

It would be best if the official repo provided an export.py script.

@nikhilchh

I converted the model to ONNX with the following options:

im = torch.randn(1, 3, 640, 640).type_as(next(model.parameters()))


torch.onnx.export(
        model.cpu(),
        im.cpu(),
        "kapao.onnx",
        verbose=False,
        opset_version=12,
        do_constant_folding=True,  
        input_names=['images'],
        output_names=['output'],
        dynamic_axes=None)

Conversion seems to be successful. But when I load the model for inference using onnxruntime, I get an error:

session = ort.InferenceSession(model_path)

Error:

onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from kapao.onnx failed:Node (Mul_2329) Op (Mul) [ShapeInferenceError] Incompatible dimensions

Was someone able to run inference using ONNX Runtime?

@nikhilchh

nikhilchh commented Sep 6, 2023

@ykk648

I'm going through your dependencies to find where exactly you call "onnxruntime.InferenceSession(model_path)",
but I could not find the code for ModelBase:

'from ...model_base import ModelBase'

@nikhilchh

I found some changes that were made to the yolov5 repo to handle this issue:

ultralytics/yolov5#2982

I guess this is the cause of the issue during inference.
