[Bug] When running ONNX inference after conversion with mmdeploy: Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11). #2869

amxl56 opened this issue Jan 12, 2025 · 0 comments

amxl56 commented Jan 12, 2025

Checklist

  • I have searched related issues but cannot get the expected help.
  • I have read the FAQ documentation but cannot get the expected help.
  • The bug has not been fixed in the latest version.

Describe the bug

1.
UserWarning: The exported ONNX model failed ONNX shape inference. The model will not be executable by the ONNX Runtime. If this is unintended and you believe there is a bug, please report an issue at https://github.com/pytorch/pytorch/issues. Error reported by strict ONNX shape inference: [ShapeInferenceError] (op_type:Where, node name: /Where_11): Y has inconsistent type tensor(float) (Triggered internally at C:\cb\pytorch_1000000000000\work\torch\csrc\jit\serialization\export.cpp:1421.)
_C._check_onnx_proto(proto)
2.
Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
3.
[2025-01-12 16:49:55.372] [mmdeploy] [info] [model.cpp:35] [DirectoryModel] Load model: "mmdeploy_model/faster-rcnn/"
[2025-01-12 16:49:55.614] [mmdeploy] [error] [ort_net.cpp:205] unhandled exception when creating ORTNet: Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
[2025-01-12 16:49:55.614] [mmdeploy] [error] [net_module.cpp:54] Failed to create Net backend: onnxruntime, config: {
"context": {
"device": "",
"model": "",
"stream": ""
},
"input": [
"prep_output"
],
"input_map": {
"img": "input"
},
"is_batched": true,
"module": "Net",
"name": "fasterrcnn",
"output": [
"infer_output"
],
"output_map": {},
"type": "Task"
}
[2025-01-12 16:49:55.614] [mmdeploy] [error] [task.cpp:99] error parsing config: {
"context": {
...
],
"output_map": {},
"type": "Task"
}
4.
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from mmdeploy_model/faster-rcnn\end2end.onnx failed:Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
01/12 16:51:10 - mmengine - ERROR - mmdeploy/tools/deploy.py - create_process - 82 - visualize onnxruntime model failed.

[ONNXRuntimeError] : 1 : FAIL : Load model from mmdeploy_model/faster-rcnn/end2end.onnx failed:Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11)
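
To confirm which input of /Where_11 carries the mismatched type, the exported model can be inspected with the onnx package. This is a minimal diagnostic sketch, assuming the model was exported to mmdeploy_model/faster-rcnn/end2end.onnx; the node name /Where_11 comes from the error above, and the type lookup only covers names that shape inference could resolve.

import onnx

model = onnx.load("mmdeploy_model/faster-rcnn/end2end.onnx")
inferred = onnx.shape_inference.infer_shapes(model)

# Map tensor names to element types from graph inputs/outputs,
# inferred value_info entries and initializers.
elem_types = {}
for vi in list(inferred.graph.input) + list(inferred.graph.output) + list(inferred.graph.value_info):
    elem_types[vi.name] = vi.type.tensor_type.elem_type
for init in inferred.graph.initializer:
    elem_types[init.name] = init.data_type

# Print the input types of every Where node; /Where_11 should show the
# tensor(int64)/tensor(float) mismatch reported by ONNX Runtime.
for node in inferred.graph.node:
    if node.op_type == "Where":
        types = [onnx.TensorProto.DataType.Name(elem_types.get(name, 0)) for name in node.input]
        print(node.name, list(zip(node.input, types)))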

Reproduction

from mmdeploy.apis import inference_model

model_cfg = 'mmdetection/configs/faster_rcnn/faster-rcnn_r50_fpn_1x_coco.py'
deploy_cfg = 'mmdeploy/configs/mmdet/detection/detection_onnxruntime_dynamic.py'
backend_files = ['mmdeploy_model/faster-rcnn/end2end.onnx']
img = 'mmdetection/demo/demo.jpg'
device = 'cpu'
result = inference_model(model_cfg, deploy_cfg, backend_files, img, device)

!python mmdeploy/demo/python/object_detection.py cpu mmdeploy_model/faster-rcnn/ mmdetection/demo/demo.jpg

!python mmdeploy/tools/deploy.py \
    mmdeploy/configs/mmdet/detection/detection_onnxruntime_dynamic.py \
    mmdetection/configs/faster_rcnn/faster-rcnn_r50_fpn_1x_coco.py \
    mmdetection/checkpoints/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth \
    mmdetection/demo/demo.jpg \
    --work-dir mmdeploy_model/faster-rcnn \
    --device cpu \
    --dump-info
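
A possible (unverified) workaround, until the export itself is fixed, is to patch the graph so both value inputs of the failing Where share one type by inserting a Cast. The sketch below assumes the second input (X) of /Where_11 is the int64 branch; check with the diagnostic snippet above before applying, since casting may not be semantically safe for what this node computes. The proper fix is to make both branches produce the same dtype at export time.

import onnx
from onnx import helper, TensorProto

model = onnx.load("mmdeploy_model/faster-rcnn/end2end.onnx")
graph = model.graph

for idx, node in enumerate(graph.node):
    if node.op_type == "Where" and node.name == "/Where_11":
        # Where takes (condition, X, Y); assume X is the int64 branch and
        # cast it to float so both branches bind T to tensor(float).
        src = node.input[1]
        cast_out = src + "_as_float"
        cast_node = helper.make_node(
            "Cast", [src], [cast_out], to=TensorProto.FLOAT, name=src + "_cast")
        node.input[1] = cast_out
        # Insert the Cast just before the Where node to keep topological order.
        graph.node.insert(idx, cast_node)
        break

onnx.save(model, "mmdeploy_model/faster-rcnn/end2end_patched.onnx")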

Environment

mmcv==2.1.0
torch==2.1.0
onnxruntime==1.15.1

Error traceback

No response
