TRAINED BREVITAS MODEL CAN NOT BE EXPORTED TO ONNX #403
mllearner98 asked this question in Q&A
I have a QNN model trained in Brevitas. The input shape for a single sample is (1, 3, 1024). When I run predictions with the model, I get about 80% accuracy and I can also plot a confusion matrix, so I am confident the model works. However, when I try to export the model to ONNX in order to use it with FINN, I get the error shown below.
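For context, the export call I am attempting is along the lines of the sketch below. This is a minimal, hypothetical reproduction rather than my exact code: the SmallQNN layers, the bit widths, and the use of export_qonnx from recent Brevitas releases are assumptions (older Brevitas versions expose a FINN-specific export function instead).

```python
import torch
import torch.nn as nn
from brevitas.nn import QuantConv1d, QuantReLU
from brevitas.export import export_qonnx

# Hypothetical stand-in for the trained network described above:
# a tiny 1D-conv QNN that accepts input of shape (1, 3, 1024).
class SmallQNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = QuantConv1d(3, 8, kernel_size=3, padding=1, weight_bit_width=4)
        self.relu = QuantReLU(bit_width=4)

    def forward(self, x):
        return self.relu(self.conv(x))

model = SmallQNN().eval()

# Dummy input matching the shape of a single sample, (1, 3, 1024)
dummy_input = torch.randn(1, 3, 1024)

# Export to QONNX, the ONNX dialect that FINN imports
export_qonnx(model, dummy_input, export_path="qnn_model.onnx")
```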
Please share any ideas you might have about this.
Sincerely