YOLOv10 end-to-end deployment: abnormal output at runtime #136
Comments
What inference code are you using?
Your end-to-end C++ code from this project.
@Linaom1214 did you test the YOLOv10 end-to-end C++ code? Did it work for you without issues?
@zeyad-mansour
YOLOv10 is an NMS-free model. The end-to-end model in this project only exists to align it with the other models that do need NMS, so its single output is manually split into three outputs. It has only been tested on the Python side.
After end-to-end export, Python and C++ should behave the same: both read the results from the score and bbox outputs, so there should be no difference. Besides, YOLOv10 already outputs final results directly. Also, can you explain what you mean by changing one output into three outputs? I don't understand that sentence.
In the ONNX-to-engine step the original output is split up; you can compare this against the export code. The num output is constructed manually, so its data type may not match what the consumer expects.
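To make the point above concrete, here is a minimal sketch (not the project's actual export.py) of what splitting YOLOv10's single NMS-free output into the separate tensors used by the v5-v9 end-to-end engines could look like. The [batch, 300, 6] layout, the output names, and the 0.25 threshold are assumptions for illustration; note that num_dets is built by hand, which is exactly where a dtype mismatch can creep in.

```python
# Minimal sketch (assumed layout, not the project's export code) of splitting
# YOLOv10's single NMS-free output into the tensors exposed by the v5-v9
# end-to-end engines: num_dets, det_boxes, det_scores, det_classes.
import numpy as np

def split_v10_output(raw: np.ndarray, conf_thres: float = 0.25):
    """raw: float32 tensor of shape [batch, 300, 6] = (x1, y1, x2, y2, score, class)."""
    boxes   = raw[..., 0:4].astype(np.float32)   # det_boxes   [batch, 300, 4]
    scores  = raw[..., 4].astype(np.float32)     # det_scores  [batch, 300]
    classes = raw[..., 5].astype(np.int32)       # det_classes [batch, 300]
    # num_dets is constructed manually (it does not come from the model); if the
    # consumer reads this buffer with the wrong dtype (float32 vs int32), the
    # decoded detection count is garbage even though boxes/scores look fine.
    num_dets = (scores > conf_thres).sum(axis=1, keepdims=True).astype(np.int32)
    return num_dets, boxes, scores, classes

if __name__ == "__main__":
    dummy = np.random.rand(1, 300, 6).astype(np.float32)
    n, b, s, c = split_v10_output(dummy)
    print(n.dtype, n.shape, b.shape, s.shape, c.shape)
```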
The current TensorRT version is 8.5.22. Quantized end-to-end models for v5-v9 have been verified to infer correctly, but the YOLOv10 inference results are abnormal.
Using the latest export.py, I ran the following YOLOv10 quantization command:
python export.py -o /media/ubuntu/data/project/yolov10/yolov10m.onnx -e yolov10m_coco.engine --end2end -v -p fp16 --v10 --workspace 9000
Running the program then produces abnormal output:
What might be causing this? @Linaom1214
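One way to narrow this down from the Python side is to dump the dtypes and shapes the built engine actually exposes and compare them with what the C++ code assumes when it copies the output buffers. A debugging sketch, assuming the TensorRT 8.x Python API and the engine file produced by the command above:

```python
# Print the name, dtype and shape of every binding in the built engine so the
# C++ reader can be checked against them. Assumes the TensorRT 8.x Python API
# and the yolov10m_coco.engine file from the export command in this issue.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("yolov10m_coco.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

for i in range(engine.num_bindings):
    kind = "input " if engine.binding_is_input(i) else "output"
    print(kind,
          engine.get_binding_name(i),
          engine.get_binding_dtype(i),   # e.g. DataType.INT32 vs DataType.FLOAT
          engine.get_binding_shape(i))
```

If the manually constructed num binding is reported as INT32 but the C++ code copies it into a float buffer (or vice versa), the decoded detection count will be wrong for YOLOv10 while the v5-v9 engines, whose count output presumably comes from the NMS plugin, keep working.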