[Bug] During ONNX inference after mmdeploy conversion: Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float)) in node (/Where_11).
#2869 · Open · 3 tasks done
amxl56 opened this issue on Jan 12, 2025 · 0 comments
Checklist
1. I have searched related issues but cannot get the expected help.
2. I have read the FAQ documentation but cannot get the expected help.
3. The bug has not been fixed in the latest version.
Describe the bug
1.
UserWarning: The exported ONNX model failed ONNX shape inference. The model will not be executable by the ONNX Runtime. If this is unintended and you believe there is a bug, please report an issue at https://github.com/pytorch/pytorch/issues. Error reported by strict ONNX shape inference: [ShapeInferenceError] (op_type:Where, node name: /Where_11): Y has inconsistent type tensor(float) (Triggered internally at C:\cb\pytorch_1000000000000\work\torch\csrc\jit\serialization\export.cpp:1421.)
_C._check_onnx_proto(proto)
2.
Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
3.
[2025-01-12 16:49:55.372] [mmdeploy] [info] [model.cpp:35] [DirectoryModel] Load model: "mmdeploy_model/faster-rcnn/"
[2025-01-12 16:49:55.614] [mmdeploy] [error] [ort_net.cpp:205] unhandled exception when creating ORTNet: Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
[2025-01-12 16:49:55.614] [mmdeploy] [error] [net_module.cpp:54] Failed to create Net backend: onnxruntime, config: {
"context": {
"device": "",
"model": "",
"stream": ""
},
"input": [
"prep_output"
],
"input_map": {
"img": "input"
},
"is_batched": true,
"module": "Net",
"name": "fasterrcnn",
"output": [
"infer_output"
],
"output_map": {},
"type": "Task"
}
[2025-01-12 16:49:55.614] [mmdeploy] [error] [task.cpp:99] error parsing config: {
"context": {
...
],
"output_map": {},
"type": "Task"
}
4.
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from mmdeploy_model/faster-rcnn\end2end.onnx failed:Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11).
01/12 16:51:10 - mmengine - ERROR - mmdeploy/tools/deploy.py - create_process - 82 - visualize onnxruntime model failed.
[ONNXRuntimeError] : 1 : FAIL : Load model from mmdeploy_model/faster-rcnn/end2end.onnx failed:Type Error: Type parameter (T) of Optype (Where) bound to different types (tensor(int64) and tensor(float) in node (/Where_11)
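For context on the error message: the ONNX `Where(condition, X, Y)` operator declares a single type parameter `T` that both value inputs `X` and `Y` must bind to the same concrete tensor type; the exported graph here binds one branch to `tensor(int64)` and the other to `tensor(float)`, so both shape inference and session creation reject node `/Where_11`. A minimal stdlib sketch of that binding constraint (hypothetical illustration, not ONNX's actual checker code):

```python
def bind_type_param(inputs):
    """Check that every input constrained by one type parameter (T)
    resolves to the same concrete tensor type, the way ONNX validates
    Where(condition, X, Y). Returns the bound type or raises TypeError."""
    bound = None
    for name, dtype in inputs:
        if bound is None:
            bound = dtype  # first input fixes what T is bound to
        elif dtype != bound:
            raise TypeError(
                f"Type parameter (T) of Optype (Where) bound to different "
                f"types ({bound} and {dtype}) at input {name}"
            )
    return bound

# The failing node /Where_11 binds X as int64 and Y as float:
try:
    bind_type_param([("X", "tensor(int64)"), ("Y", "tensor(float)")])
except TypeError as exc:
    print(exc)

# If both branches shared one dtype, binding would succeed:
print(bind_type_param([("X", "tensor(float)"), ("Y", "tensor(float)")]))
```

This is why the fix has to make both value inputs of the `Where` node the same dtype, either in the model code before export or in the exported graph.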
Reproduction
from mmdeploy.apis import inference_model
model_cfg = 'mmdetection/configs/faster_rcnn/faster-rcnn_r50_fpn_1x_coco.py'
deploy_cfg = 'mmdeploy/configs/mmdet/detection/detection_onnxruntime_dynamic.py'
backend_files = ['mmdeploy_model/faster-rcnn/end2end.onnx']
img = 'mmdetection/demo/demo.jpg'
device = 'cpu'
result = inference_model(model_cfg, deploy_cfg, backend_files, img, device)
!python mmdeploy/demo/python/object_detection.py cpu mmdeploy_model/faster-rcnn/ mmdetection/demo/demo.jpg
!python mmdeploy/tools/deploy.py \
    mmdeploy/configs/mmdet/detection/detection_onnxruntime_dynamic.py \
    mmdetection/configs/faster_rcnn/faster-rcnn_r50_fpn_1x_coco.py \
    mmdetection/checkpoints/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth \
    mmdetection/demo/demo.jpg \
    --work-dir mmdeploy_model/faster-rcnn \
    --device cpu \
    --dump-info
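One common workaround for this class of export bug is to patch the exported graph so the mismatched branch of the `Where` node passes through a `Cast` first, making both value inputs the same dtype. Below is a sketch of that idea over a hypothetical dict-based graph representation (the node and tensor names are illustrative); a real fix would do the same surgery on `end2end.onnx` with the `onnx` or `onnx-graphsurgeon` packages:

```python
def insert_cast_before_where(nodes, where_name, input_index, to_type):
    """Route one value input of a Where node through a new Cast node so
    both branches share dtype `to_type`. Operates on a hypothetical
    list-of-dicts graph; returns a patched copy without mutating input."""
    patched = []
    for node in nodes:
        if node["op"] == "Where" and node["name"] == where_name:
            src = node["inputs"][input_index]
            cast_out = src + "_cast"
            # New Cast node feeding the Where input that had the wrong dtype.
            patched.append({
                "op": "Cast", "name": src + "/Cast",
                "inputs": [src], "outputs": [cast_out],
                "attrs": {"to": to_type},
            })
            node = dict(node)  # copy so the original graph stays untouched
            node["inputs"] = list(node["inputs"])
            node["inputs"][input_index] = cast_out
        patched.append(node)
    return patched

# Toy graph with the mismatched node reported in the logs:
graph = [{"op": "Where", "name": "/Where_11",
          "inputs": ["cond", "x_int64", "y_float"], "outputs": ["out"]}]
# Cast the int64 branch (value input index 1) to float:
fixed = insert_cast_before_where(graph, "/Where_11", 1, "FLOAT")
```

Alternatively, the root cause can often be fixed before export by ensuring the `torch.where` call in the model (or in mmdeploy's rewriter) receives both branches in the same dtype.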
Environment
Error traceback
No response