-
Is it possible to export these models to ONNX and use them with TensorRT? I'm particularly interested in doing so with regnet_32 - is there an example of how to do this?
-
@LukeAI You can likely leverage code I have in another repository for export to ONNX and validation in ONNX.
See the scripts there, and note you should be able to change this line to timm.create_model and export many (most?) of the models here: https://github.com/rwightman/gen-efficientnet-pytorch/blob/master/onnx_export.py#L65
TensorRT has further constraints beyond ONNX, but I imagine it should still work for several models (I haven't tried in a while). Things often break (unrelated to the models) across different PyTorch/ONNX/TensorRT releases, so expect some turbulence.
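For reference, here is a minimal sketch of that export path driven from timm directly rather than the linked onnx_export.py script. The model name regnety_032 is my guess at the closest timm identifier for "regnet_32" -- check timm.list_models("regnet*") for the exact name you want:

```python
# Minimal sketch (not the linked onnx_export.py): export a timm model to ONNX
# via torch.onnx.export. "regnety_032" is an assumed model name -- check
# timm.list_models("regnet*") for the identifier you actually want.
import timm
import torch

model = timm.create_model("regnety_032", pretrained=True, exportable=True)
model.eval()

# Dummy input matching the model's default 3x224x224 input config.
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "regnety_032.onnx",
    opset_version=13,                 # pick an opset your TensorRT version supports
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},  # optional dynamic batch dim
)
```

From there you can build a TensorRT engine from the .onnx file (e.g. with trtexec --onnx=regnety_032.onnx) and compare its outputs against PyTorch before trusting it.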