Using Google Colab to run inference with Intel OpenVINO.
Intel OpenVINO on Google Colab
Step 1: Installing OpenVINO. Ref: AllModels.ipynb
!pip install openvino
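Note: the snippets below use the legacy openvino.inference_engine Python API, which newer openvino wheels no longer ship. If the imports in Step 2 fail on a recent Colab runtime, pinning an older release is one workaround (the exact version below is an assumption; adjust as needed):
!pip install openvino==2021.4.2   # assumed legacy release; any 2021.x wheel with the IE API should work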
Step 2: Setting up the environment. Ref: AllModels.ipynb
import warnings
from google.colab.patches import cv2_imshow
from openvino.inference_engine import IECore, IENetwork
### Suppress the DeprecationWarning that the IENetwork class raises in newer releases
warnings.filterwarnings("ignore", category=DeprecationWarning)
def load_IR_to_IE(model_xml):
    ### Load the Inference Engine API
    plugin = IECore()
    ### Load the IR files into an IENetwork class;
    ### the .bin weights file shares the .xml file's basename
    model_bin = model_xml[:-3] + "bin"
    network = IENetwork(model=model_xml, weights=model_bin)
    ### Load the network onto the CPU device
    executable_net = plugin.load_network(network, "CPU")
    print("Network successfully loaded into the Inference Engine")
    return executable_net
def synchronous_inference(executable_net, image):
    ### Get the input blob name for the inference request
    input_blob = next(iter(executable_net.inputs))
    ### Perform synchronous inference
    result = executable_net.infer(inputs={input_blob: image})
    return result
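As a usage sketch, the two helpers above can be chained as follows; the model path, image path, and 224x224 input size are placeholders, not values from the notebook:
import cv2
### Load an IR model; "model.xml" is a placeholder for your own IR file
executable_net = load_IR_to_IE("model.xml")
### Read an image and rearrange it into the NCHW layout the network expects;
### 224x224 is an assumed input size - check your model's actual dimensions
image = cv2.imread("input.jpg")
image = cv2.resize(image, (224, 224))
image = image.transpose((2, 0, 1))      # HWC -> CHW
image = image.reshape(1, 3, 224, 224)   # add the batch dimension
### The result is a dict keyed by output blob name
result = synchronous_inference(executable_net, image)
for output_name, output in result.items():
    print(output_name, output.shape)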
For example use cases, refer to the notebook.
Demo 1: Inference Demo
Demo 2: IR File Generation and Inference (a Model Optimizer sketch follows below)
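For the IR-generation step in Demo 2, one option on Colab is OpenVINO's Model Optimizer, which ships with the openvino-dev package. A minimal sketch, assuming an ONNX model named model.onnx (both the package choice and the file name are assumptions, not taken from the notebook):
!pip install openvino-dev
### Convert the ONNX model to an OpenVINO IR (.xml + .bin) pair
!mo --input_model model.onnx --output_dir ir_model/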
The model descriptions have been borrowed from Intel, and the code has been adapted from this repository.