
error in ms_deformable_im2col_cuda: invalid configuration argument #4260

nainaigetuide opened this issue Nov 25, 2024 · 4 comments

Description

The following error occurred while converting the ONNX file to a TensorRT engine on Jetson Orin. For the detailed logs, see the attached file.

The execution command is
/usr/src/tensorrt/bin/trtexec --onnx=./checkpoints/onnx/bevformer_small_epoch_24_cp2_add_cast.onnx --saveEngine=bevformer_small_epoch_24_cp2_add_cast.engine --fp16 --int8 --plugins=/home/orin/disk/BEVFormer_tensorrt/TensorRT/lib/libtensorrt_ops.so --dumpLayerInfo --profilingVerbosity=detailed --exportLayerInfo=layers.json --verbose

The error message is
‘error in ms_deformable_im2col_cuda: invalid configuration argument’

When I use TensorRT 8.6, this error does not occur, so I think it is strongly related to the TensorRT version.
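
For reference, the message appears to be printed by a cudaGetLastError() check right after the im2col kernel launch in the plugin (the usual Deformable-DETR-style pattern). The small self-contained snippet below is only an illustration, not the plugin code: it reproduces the same CUDA error with a dummy kernel, since a zero or out-of-range grid dimension is the typical trigger.

```cuda
// Self-contained illustration (not the plugin code): the same CUDA error,
// cudaErrorInvalidConfiguration, is raised when a kernel is launched with an
// invalid configuration, e.g. a grid dimension of 0.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void dummy_kernel() {}

int main() {
  dummy_kernel<<<0, 256>>>();  // grid.x == 0 -> rejected at launch time
  cudaError_t err = cudaGetLastError();
  if (err != cudaSuccess) {
    // Prints "invalid configuration argument", the same string that appears in
    // the plugin's "error in ms_deformable_im2col_cuda: ..." message.
    std::printf("launch error: %s\n", cudaGetErrorString(err));
  }
  return 0;
}
```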

Environment

TensorRT Version: 10.3

NVIDIA GPU: Jetson Orin

CUDA Version: 12.2

JetPack Version: 12.2

Operating System: Ubuntu 22.04

Baremetal or Container (if so, version): nvcr.io/nvidia/l4t-jetpack:r36.4.0

Relevant Files

Model link: https://drive.google.com/file/d/1rAQWJJt5qaCkxevlAnzmCgnUYfXzCHoB/view?usp=sharing

Steps To Reproduce

  1. Pull the image from https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-jetpack/tags
  2. Compile the custom plugin to generate libtensorrt_ops.so, using the plugin sources from https://github.com/DerryHub/BEVFormer_tensorrt/tree/main/TensorRT/plugin (a version sanity check is sketched below this list).
  3. Run /usr/src/tensorrt/bin/trtexec --onnx=./checkpoints/onnx/bevformer_small_epoch_24_cp2_add_cast.onnx --saveEngine=bevformer_small_epoch_24_cp2_add_cast.engine --fp16 --int8 --plugins=/home/orin/disk/BEVFormer_tensorrt/TensorRT/lib/libtensorrt_ops.so --dumpLayerInfo --profilingVerbosity=detailed --exportLayerInfo=layers.json --verbose
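
As a sanity check for step 2, here is a minimal sketch (not part of the repo; getInferLibVersion() and the NV_TENSORRT_* macros are standard TensorRT APIs, while the file name and build line are assumptions) to confirm that the headers the plugin was built against match the libnvinfer runtime on the Orin:

```cpp
// check_trt_version.cpp -- compare build-time vs. runtime TensorRT versions.
// Build (include/library paths may differ on your JetPack install):
//   g++ check_trt_version.cpp -I/usr/include/aarch64-linux-gnu -lnvinfer -o check_trt_version
#include <NvInferRuntime.h>   // declares getInferLibVersion()
#include <NvInferVersion.h>   // NV_TENSORRT_MAJOR / MINOR / PATCH (build-time)
#include <cstdio>

int main() {
  std::printf("headers:  TensorRT %d.%d.%d\n",
              NV_TENSORRT_MAJOR, NV_TENSORRT_MINOR, NV_TENSORRT_PATCH);
  std::printf("runtime:  libnvinfer version code %d\n", getInferLibVersion());
  return 0;
}
```

If the two disagree (for example, 8.6 headers but a 10.3 runtime), the plugin needs to be rebuilt inside the same container that runs trtexec.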

@lix19937

"invalid configuration argument" usually means the CUDA kernel launch itself failed, and you had better recompile libtensorrt_ops separately for each TensorRT version you use.
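
One way to narrow this down is to log the launch configuration right before the kernel launch inside ms_deformable_im2col_cuda; here is a rough sketch (variable and helper names are assumptions mirroring the common im2col implementations, not necessarily this plugin's exact code):

```cpp
// Diagnostic sketch to add just before the kernel launch; names such as
// GET_BLOCKS, CUDA_NUM_THREADS and num_kernels are assumptions based on the
// usual im2col code and may differ in libtensorrt_ops.
const int num_kernels = batch_size * num_query * num_heads * channels;
const int num_blocks  = GET_BLOCKS(num_kernels);  // typically (n + threads - 1) / threads

printf("im2col launch: num_kernels=%d blocks=%d threads=%d\n",
       num_kernels, num_blocks, CUDA_NUM_THREADS);
// A grid dimension of 0, or one above the device limit, is rejected with
// "invalid configuration argument"; a zero usually means one of the shape
// values coming from TensorRT was not what the plugin expected.
```

If num_kernels comes out as zero under TensorRT 10.3 but not under 8.6, the problem is likely in how the plugin reads its input dimensions rather than in the kernel itself.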

@nainaigetuide

My libtensorrt_ops was recompiled in the TensorRT 10.3 environment, but this error is still reported. Why is that? Thank you. @lix19937

@lix19937

You can do a unit test for your plugin.
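
A minimal standalone sketch of such a test is below. The ms_deformable_im2col_cuda prototype is an assumption based on the common Deformable-DETR implementation; match it to the actual declaration in the BEVFormer_tensorrt plugin sources and link this file against the plugin's im2col object.

```cuda
// im2col_unit_test.cu -- launch the im2col routine outside TensorRT with tiny,
// valid shapes and check for launch errors. The prototype below is assumed.
#include <cuda_runtime.h>
#include <cstdio>

template <typename scalar_t>
void ms_deformable_im2col_cuda(cudaStream_t stream, const scalar_t* value,
                               const int* spatial_shapes, const int* level_start_index,
                               const scalar_t* sampling_loc, const scalar_t* attn_weight,
                               int batch, int spatial_size, int num_heads, int channels,
                               int num_levels, int num_query, int num_point,
                               scalar_t* out);  // assumed prototype; adjust to the plugin

#define CUDA_CHECK(call)                                                      \
  do {                                                                        \
    cudaError_t e = (call);                                                   \
    if (e != cudaSuccess) {                                                   \
      std::printf("CUDA error: %s (%s:%d)\n", cudaGetErrorString(e),          \
                  __FILE__, __LINE__);                                        \
      return 1;                                                               \
    }                                                                         \
  } while (0)

// Allocate and zero-fill a device buffer of n elements (no error checking;
// this is only a test sketch).
template <typename T>
static T* device_zeros(size_t n) {
  T* p = nullptr;
  cudaMalloc(reinterpret_cast<void**>(&p), n * sizeof(T));
  cudaMemset(p, 0, n * sizeof(T));
  return p;
}

int main() {
  // Tiny but valid shapes: the goal is only to confirm the kernel launch
  // configuration is accepted, not to check numerical output.
  const int batch = 1, spatial = 64, heads = 8, channels = 32;
  const int levels = 1, queries = 16, points = 4;

  float* value  = device_zeros<float>(batch * spatial * heads * channels);
  float* loc    = device_zeros<float>(batch * queries * heads * levels * points * 2);
  float* weight = device_zeros<float>(batch * queries * heads * levels * points);
  float* out    = device_zeros<float>(batch * queries * heads * channels);
  int*   shapes = device_zeros<int>(levels * 2);
  int*   starts = device_zeros<int>(levels);

  const int h_shapes[2] = {8, 8};  // 8 * 8 == spatial
  CUDA_CHECK(cudaMemcpy(shapes, h_shapes, sizeof(h_shapes), cudaMemcpyHostToDevice));

  ms_deformable_im2col_cuda<float>(nullptr, value, shapes, starts, loc, weight,
                                   batch, spatial, heads, channels, levels,
                                   queries, points, out);
  CUDA_CHECK(cudaGetLastError());        // catches "invalid configuration argument"
  CUDA_CHECK(cudaDeviceSynchronize());   // catches errors raised during execution
  std::printf("im2col launch OK\n");
  return 0;
}
```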
