Next Ops to work on #922
Comments
Working on Compress.
Working on NonMaxSuppression.
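For anyone picking this op up: the core of NonMaxSuppression is a greedy loop that keeps the highest-scoring boxes and suppresses later boxes whose IoU with an already-kept box exceeds a threshold. A minimal plain-Python sketch of the single-batch, single-class case (function and parameter names are illustrative, not onnx-mlir code; the real op also handles `score_threshold` and per-class batching):

```python
def iou(a, b):
    # Boxes in [y1, x1, y2, x2] corner format (the ONNX default, center_point_box=0).
    inter_h = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    inter_w = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = inter_h * inter_w
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0.0 else 0.0

def nms(boxes, scores, iou_threshold=0.5, max_out=None):
    # Visit boxes in decreasing score order; keep a box only if it does not
    # overlap any previously kept box beyond iou_threshold.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= iou_threshold for j in keep):
            keep.append(i)
        if max_out is not None and len(keep) >= max_out:
            break
    return keep
```

Here two nearly identical boxes collapse to one, while a disjoint box survives.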
FYI, here are some of the benchmarks we are focusing on that contain ops not yet working.
High priority (from the model zoo): support models compiled down to their lowest-level components (e.g. RNNs not exported as high-level ONNX ops), with no crashes.
Medium priority: Hugging Face GBERTQnA.
A list of ops currently not supported and present in the Model Zoo is at the end of issue #128.
I am going to look at
Working on OneHot to support multiple types.
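Supporting "multiple types" in OneHot mostly comes down to the `values` input: its element type determines the output type, so the off/on values can be int, float, bool, etc. A rough sketch of the 1-D semantics per the ONNX spec (negative indices wrap, out-of-range indices yield an all-off row; names here are illustrative):

```python
def one_hot_1d(indices, depth, off_value=0, on_value=1):
    # off_value/on_value may be any numeric type; the output inherits that type.
    out = []
    for i in indices:
        k = i + depth if i < 0 else i  # negative indices count from the end
        # An out-of-range index matches no position, producing an all-off row.
        out.append([on_value if j == k else off_value for j in range(depth)])
    return out
```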
Working on Hardmax to support BiDAF. PR #950 (merged).
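For reference, Hardmax per the ONNX spec writes 1 at the first occurrence of the maximum along the chosen axis and 0 elsewhere. A plain-Python sketch of the row-wise (2-D, axis=-1) case, with illustrative names:

```python
def hardmax_rows(x):
    # For each row, 1.0 at the first maximal element, 0.0 elsewhere.
    out = []
    for row in x:
        m = row.index(max(row))  # list.index returns the first occurrence, matching the spec's tie-break
        out.append([1.0 if j == m else 0.0 for j in range(len(row))])
    return out
```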
Working on Resize.
Working on the IsNaN op.
Working on
Implemented
Working on
The status of implemented ops is now listed here: https://github.com/onnx/onnx-mlir/blob/main/docs/SupportedONNXOps-cpu.md
Hi, thank you for your excellent work!
@airMeng please go ahead with a PR for
Hi, can I work on the Celu op?
Could somebody please add support for the QuantizeLinear/DequantizeLinear ops for quantized networks?
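For context, the ONNX spec defines QuantizeLinear as `y = saturate(round(x / scale) + zero_point)` with round-half-to-even, and DequantizeLinear as its affine inverse. A scalar uint8 sketch in plain Python (names are illustrative; Python's built-in `round` also rounds half to even, matching the spec):

```python
def quantize_linear(x, scale, zero_point=0, lo=0, hi=255):
    # y = saturate(round(x / scale) + zero_point), saturating to the uint8 range by default.
    q = round(x / scale) + zero_point
    return max(lo, min(hi, int(q)))

def dequantize_linear(q, scale, zero_point=0):
    # x_approx = (q - zero_point) * scale
    return (q - zero_point) * scale
```

Round-tripping a representable value recovers it, while out-of-range values saturate at the type bounds.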
Is anyone working on extending the
@srcarroll CustomOps is a non-standard op that @chentong319 added to easily convert a custom op into a function call. While Tong knows best, my recollection (and I may be wrong here) is that custom ops are mainly generated within onnx-mlir, not parsed in from an ONNX protobuf. Tong is away for a bit; if you wanted to add support for more custom ops, we would certainly be interested in taking in the changes. There are also ONNX functions; maybe those could help too.
@AlexandreEichenberger thanks for the response. I'd be happy to add support, but I can't find any info on the definition of. Could you also point me to the ONNX functions you are referring to and how to emit them? Thanks.
I could not find info about your "InPlaceAccumulatorV2"; it does not appear in the ONNX specs or in what I could find about the OnnxRuntimeExtensions, though I may have overlooked something in ORT as I am not very familiar with it. How did you create the ONNX graph? As for creating custom ONNX functions, the ORT documentation describes a preferred way to make new custom ops: https://onnxruntime.ai/docs/reference/operators/add-custom-op.html The ONNX specs also have a section on ONNX functions: https://onnx.ai/onnx/intro/concepts.html#functions
Idea: post a quick comment here to claim the operations you are currently working on, so that we do not duplicate work.
You can also add a request for a new op.