
OpenVINO™ Execution Provider for ONNXRuntime 5.8


Released by @vthaniel on 13 Oct 12:50 · commit d70213d

Description:
OpenVINO™ Execution Provider for ONNXRuntime v5.8 release, based on the OpenVINO™ 2025.3.0 release and the ONNXRuntime 1.23.0 release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

This release supports ONNXRuntime 1.23.0 with the latest OpenVINO™ 2025.3.0 release.
Please refer to the OpenVINO™ Execution Provider for ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Modifications:

  • Supports OpenVINO™ 2025.3.0.
  • Added support for inferencing dynamically shaped models using the reshape_input provider option (see the sketch after this list).
  • Enabled setting of model layouts.
  • Performance optimizations: removed unintended model copies during compilation.
  • Reduced peak memory usage by optimizing fallback logic and model proto handling.
  • ORT GenAI is now supported with the OpenVINO EP by setting the enable_causallm provider option to True.
  • ORT now supports EPContext models with OVIR (i.e., model.xml and model.bin) stored in the ep_cache_context attribute.
  • Quantization enhancements via adaptive stripping.
  • Added a QDQ scale propagation pass.
  • Enabled QDQ channel-wise quantization for Intel NPU-friendly quantization.
  • Added support for the HardSwish and SimplifiedLayerNormalization operators.
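
A minimal sketch of passing these provider options through the Python API; the option names come from the notes above, while the model path, input name, and the reshape_input value format are illustrative assumptions:

import onnxruntime as ort

# Assumptions: model.onnx exists and "data" is the name of its dynamic input;
# the reshape_input value format below is illustrative, not authoritative.
session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{
        "device_type": "CPU",                  # target device
        "reshape_input": "data[1,3,224,224]",  # pin a concrete shape for a dynamic input
        "enable_causallm": "True",             # opt in to the ORT GenAI causal-LM path
    }],
)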

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Package:
https://pypi.org/project/onnxruntime-openvino/

Installation and usage instructions on Windows:

pip install onnxruntime-openvino

# If using the OpenVINO Python package to set up the OpenVINO runtime environment:
pip install openvino==2025.3.0

# Then add these two lines to the application code:
import onnxruntime.tools.add_openvino_win_libs as utils
utils.add_openvino_libs_to_path()
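
Putting the steps together, a minimal end-to-end sketch; the model file, input name, and shape are placeholder assumptions:

import numpy as np
import onnxruntime as ort
import onnxruntime.tools.add_openvino_win_libs as utils

utils.add_openvino_libs_to_path()  # make the OpenVINO DLLs discoverable on Windows

# Assumption: model.onnx takes a float32 input named "input" of shape [1, 3, 224, 224].
session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{"device_type": "CPU"}],
)
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {"input": x})
print(outputs[0].shape)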

C# Package:
Download the Microsoft.ML.OnnxRuntime.Managed NuGet package from
https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.Managed
and use it together with the Intel.ML.OnnxRuntime.OpenVino NuGet package from
https://www.nuget.org/packages/Intel.ML.OnnxRuntime.OpenVino

ONNXRuntime API usage:
Please refer to the link below for the Python/C++ configuration options:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options
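
As one example of these configuration options, here is a hedged sketch of producing an EPContext model with the OpenVINO EP. The ep.context_* keys are standard ONNX Runtime session config keys; the file names and device choice are assumptions:

import onnxruntime as ort

# Assumptions: model.onnx is the source model; model_ctx.onnx is where the
# precompiled EPContext model (OVIR stored in ep_cache_context) will be written.
so = ort.SessionOptions()
so.add_session_config_entry("ep.context_enable", "1")                  # emit an EPContext model
so.add_session_config_entry("ep.context_file_path", "model_ctx.onnx")  # output path
so.add_session_config_entry("ep.context_embed_mode", "1")              # embed the blob in the node

ort.InferenceSession(
    "model.onnx",
    sess_options=so,
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{"device_type": "NPU"}],
)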