

Cannot convert .onnx file to .blob file using Python script, and Luxonis blob converter takes forever #68

Open
KennethEladistu opened this issue Jun 18, 2024 · 5 comments

@KennethEladistu

Hello, I am currently using YOLOv8 OBB and want to integrate my model on our OAK-D Pro. I have already converted it to an ONNX file and am now looking for a way to convert that .onnx file into a .blob file. I have tried two approaches so far. The first was the blob converter: I ran it for 7 hours and it still had not converted my file to .blob.
[Screenshot 2024-06-18 102143]

I also tried this code:

import blobconverter

blob_path = blobconverter.from_onnx(
    model="C:/Users/KennethAaronEladistu/Documents/PT weights/best(1).onnx",
    data_type="FP16",
    shaves=6,
)

and this is its output:

[Screenshot 2024-06-18 102620]

@HonzaCuhel
Contributor

Hi @KennethEladistu,

Thank you for reporting this issue. Could you please share the source .pt and .onnx files with us so that we can take a closer look?

Best,
Jan

@KennethEladistu
Author


Hello, here is the link to my source .pt and .onnx files:

https://drive.google.com/drive/folders/1x-IRw6zsS-eKANdP3b352yYiYbQXoY1I?usp=sharing

@HonzaCuhel
Contributor

Hi,

Thank you for sharing. I'll look into it and get back to you as soon as I find something.

Best,
Jan

@KennethEladistu
Author


Thanks!

@HonzaCuhel
Contributor

Hi @KennethEladistu,

I am very sorry for such a long delay in my response. The model wasn't converted because of an unsupported Sin operation at the end of the model (as highlighted in the screenshot).

[screenshot highlighting the unsupported Sin node]

That is why this part (the post-processing) must be pruned from the model and instead applied on the host to the output of the pruned model. We are currently implementing support for the YOLOv8 OBB model in our DepthAI; however, as far as I know, the ETA is flexible. If you need the model as soon as possible, I can export the pruned model for you and give you some pointers on what needs to be implemented to use it on an OAK device. Another option would be to wait for our official support of this model type.

Again, I am genuinely sorry for the delay in my response.

Kind regards,
Jan
