How shall we approach missing Ops in TorchSharp? #3
Replies: 2 comments 1 reply
-
There seems to be no magic. All you have to do is export the extra ops functions from a normal dll/so file and put that into …
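To make that concrete, here is a minimal sketch of what exporting an "extra op" from a native shared library looks like. The function name `my_extra_relu` and the toy op are hypothetical illustrations, not a real TorchSharp or LibTorch API; a real custom op would operate on tensors, but the export mechanism (`extern "C"` plus default symbol visibility) is the same.

```cpp
// Hypothetical sketch: exposing an "extra op" from a native shared
// library so managed code (e.g. TorchSharp via P/Invoke) can call it.
#include <cstddef>

extern "C" {

// A toy element-wise op: in-place ReLU over a raw float buffer.
// Real ops would take tensor arguments; only the export style matters here.
#if defined(_WIN32)
__declspec(dllexport)
#else
__attribute__((visibility("default")))
#endif
void my_extra_relu(float* data, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        if (data[i] < 0.0f) data[i] = 0.0f;
}

} // extern "C"
```

On the managed side one would then declare a matching `[DllImport("...")]` binding against the compiled dll/so; the library and entry-point names above are assumptions for illustration.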
-
Need your view on these Contributed Ops in ONNX Runtime
These are curated ops contributed, AFAIK, by industry partners. Under what circumstances do you think it would make sense to bring these ops to TorchSharp? Many of these contributed ops are deep NLP tokenizers; perhaps some of them relate to the ones you listed in "Introduce more tokenizers".
I still do not have a complete overview, and I am not sure whether these contributed ops are accessible through the C# ONNX Runtime API or limited to the Python one. I appreciate your effort to bring more native ops to TorchSharp and to show us a possible workflow for doing that.
-
TorchSharp
dotnet/TorchSharp#790
dotnet/TorchSharp#1081
dotnet/TorchSharp#867
dotnet/TorchSharp#610
onnxruntime-extensions
microsoft/onnxruntime-extensions#468
@K024
I still have not developed an overview of the missing ops in TorchSharp.
We really appreciate your effort to support the TorchSharp community in this area.
If you have ideas about the framework we need to work towards, or how to organize a community effort to approach the missing ops, please share them and help us organize this topic better.