-
I have trained a Conformer model and a FastConformer model, and during training I saw a ~2X speed-up with the FastConformer model. But when I convert the models to Riva and run inference, I get similar inference times for the Conformer and FastConformer models. Do you have any clue about this? Thanks
Answered by mehadi92, Aug 27, 2024
-
After upgrading the ONNX Runtime version, I got the expected performance improvement.
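
Since the fix came from the runtime rather than the model, it can help to confirm which ONNX Runtime build and execution providers your deployment is actually using. A minimal sketch, assuming the `onnxruntime` (or `onnxruntime-gpu`) Python package backs the inference path:

```python
# Minimal sketch: print the installed ONNX Runtime version and the
# execution providers it was built with.
# Assumes onnxruntime or onnxruntime-gpu is installed; upgrade with e.g.
#   pip install --upgrade onnxruntime-gpu
import onnxruntime as ort

print("ONNX Runtime version:", ort.__version__)
print("Available execution providers:", ort.get_available_providers())

# If CUDAExecutionProvider is not listed, sessions fall back to the CPU
# provider, which could mask any model-level speed-up such as the
# Conformer vs. FastConformer difference.
```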