Has anyone successfully exported DEIM to ONNX or TensorRT with dynamic batch sizes? While export_onnx.py and trtexec both export the model without complaint, I get an error tied to the model architecture ('/model/decoder/GatherElements') during batch inference with both the ONNX model and the TensorRT engine. I used the following trtexec command for the export:
trtexec --onnx=model.onnx --saveEngine=model.trt --minShapes=images:1x3x640x640,orig_target_sizes:1x2 --optShapes=images:1x3x640x640,orig_target_sizes:1x2 --maxShapes=images:32x3x640x640,orig_target_sizes:32x2 --fp16
My input shapes are correct (e.g., for a batch size of 2: images: torch.Size([2, 3, 640, 640]), orig_target_sizes: torch.Size([2, 2])).
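For reference, this is a minimal sketch of the batch inference that fails for me (assuming the exported file is model.onnx, the input names are images and orig_target_sizes as in the trtexec command above, and that orig_target_sizes is int64):

import numpy as np
import onnxruntime as ort

# Load the exported model with ONNX Runtime.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

batch = 2
images = np.random.rand(batch, 3, 640, 640).astype(np.float32)
orig_target_sizes = np.array([[640, 640]] * batch, dtype=np.int64)

# Fails inside the /model/decoder/GatherElements node as soon as batch > 1.
outputs = sess.run(None, {"images": images, "orig_target_sizes": orig_target_sizes})
print([o.shape for o in outputs])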
This is the error with onnx:
2025-03-21 09:46:45.304331171 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running GatherElements node. Name:'/model/decoder/GatherElements' Status Message: GatherElements op: 'indices' shape should have values within bounds of 'data' shape. Invalid value in indices shape is: 2
This is the error with trt:
[03/21/2025-09:16:43] [TRT] [E] IExecutionContext::executeV2: Error Code 7: Internal Error (/model/decoder/GatherElements: The extent of dimension 0 of indices must be less than or equal to the extent of data. Condition '<' violated: 2 >= 1. Instruction: CHECK_LESS 2 1.)
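For anyone debugging this, here is a small sketch (illustrative only, not part of export_onnx.py) that prints the input/output dims of the exported graph to check whether the batch axis really came out dynamic; the "2 >= 1" in the TensorRT message suggests some tensor feeding GatherElements still has a batch extent of 1:

import onnx

model = onnx.load("model.onnx")
for value_info in list(model.graph.input) + list(model.graph.output):
    shape = value_info.type.tensor_type.shape
    # dim_param holds a symbolic name (e.g. "N") for dynamic axes; dim_value is a fixed size.
    dims = [d.dim_param or d.dim_value for d in shape.dim]
    print(value_info.name, dims)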
The wholebody28 branch of this fork only has some parameters customized for training Wholebody28, plus ONNX/TensorRT optimizations: https://github.com/PINTO0309/DEIM