
NNIE-lite

⚡️ Using NNIE as simple as using ncnn ⚡️

Accelerate your model's inference on camera SoCs using NNIE.

NNIE is short for Neural Network Inference Engine, the hardware unit in HiSilicon media SoCs dedicated to accelerating neural networks, especially deep convolutional neural networks.

The project is aimed at algorithm engineers who do not want to deal with the low-level details of NNIE or the camera system: you can focus purely on your model's accuracy and latency.


Introduction

  • CMake-Based Project

  • Straightforward to apply to Hi3516CV500 and Hi3519AV100

  • NNIE workflow refactored with C++ & OpenCV


Environment

  • Hisi SDK version

    • Hi3516CV500_SDK_V2.0.0.3
  • Cross Tools

    • 32-bit
    • arm-himix200-linux

We recommend reading the Related Blog section for more detail.
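If your copy of the repository does not already provide one, a minimal CMake toolchain file for the arm-himix200-linux cross compiler might look like the sketch below. The compiler prefix matches the SDK listed above; the dependency path is an assumption you should adapt to your environment.

```cmake
# toolchain-himix200.cmake -- cross-compile for Hi3516CV500 / Hi3519AV100 (32-bit ARM).
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)

# Assumes arm-himix200-linux-gcc/g++ from the Hisi SDK are already on PATH.
set(CMAKE_C_COMPILER   arm-himix200-linux-gcc)
set(CMAKE_CXX_COMPILER arm-himix200-linux-g++)

# Point find_package()/find_library() at your cross-compiled dependencies
# (e.g. an OpenCV build for ARM); /opt/arm-deps is a placeholder path.
set(CMAKE_FIND_ROOT_PATH /opt/arm-deps)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

Configure the build with `cmake -DCMAKE_TOOLCHAIN_FILE=toolchain-himix200.cmake ..` so that every target is compiled for the board rather than the host.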


Usage

  • Model Converter

    • A template for converting Caffe_ENet to NNIE_ENet is provided (see the cfg sketch after this list).
  • Deploy

    • The examples show how to run general classification and general segmentation.

      • MNIST is used as the general classification example.
      • ENet is used as the general segmentation example.
    • You can use these examples as a reference when coding your own task.
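As a rough illustration of the Model Converter step, an nnie_mapper configuration for compiling a Caffe ENet model into an NNIE .wk instruction file typically looks like the sketch below. The key names follow the HiSVP nnie_mapper format shipped with the SDK, but the paths and values here are placeholders; check the template in this repository and your SDK documentation for the exact options.

```
# enet_mapper.cfg -- placeholder paths/values; adapt to your model and SDK version.
[prototxt_file] ./model/enet.prototxt
[caffemodel_file] ./model/enet.caffemodel
[batch_num] 1
[net_type] 0
[compile_mode] 0
[is_simulation] 0
[instruction_name] ./enet_inst
[image_list] ./data/image_list.txt
[image_type] 1
[norm_type] 3
[data_scale] 0.0039062
[RGB_order] RGB
```

Running nnie_mapper on such a cfg produces the .wk file that is loaded on the board; fields like image_type, norm_type, and data_scale must match the preprocessing your network was trained with.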


Todo list

  • ResNet 18
  • MobileNetv2-yolov2
  • MobileNetv2-yolov3
  • CPM
  • OpenPose

Supported Models


Debug Conveniently

You can run ENet inference on Cityscapes images and generate a color mask on the camera directly.

A demo output is shown below; the model is just a toy trained for a few epochs.
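For quick host-side debugging, the sketch below (not part of NNIE-lite itself; the file names and helper are illustrative) shows one way to turn a single-channel label map, e.g. the argmax of the ENet output, into a Cityscapes-style color mask with OpenCV and overlay it on the original frame.

```cpp
// colorize_mask.cpp -- host-side debugging sketch (not part of NNIE-lite).
// Builds against stock OpenCV; the input file names are illustrative.
#include <cstdint>
#include <opencv2/opencv.hpp>

// Map a single-channel label map (pixel value == class index) to the
// standard Cityscapes colors for the 19 training classes, in BGR order.
static cv::Mat ColorizeMask(const cv::Mat& label_map) {
    static const cv::Vec3b kPalette[19] = {
        {128,  64, 128}, {232,  35, 244}, { 70,  70,  70}, {156, 102, 102},
        {153, 153, 190}, {153, 153, 153}, { 30, 170, 250}, {  0, 220, 220},
        { 35, 142, 107}, {152, 251, 152}, {180, 130,  70}, { 60,  20, 220},
        {  0,   0, 255}, {142,   0,   0}, { 70,   0,   0}, {100,  60,   0},
        {100,  80,   0}, {230,   0,   0}, { 32,  11, 119}};
    cv::Mat color(label_map.size(), CV_8UC3, cv::Scalar::all(0));
    for (int y = 0; y < label_map.rows; ++y) {
        const uint8_t* src = label_map.ptr<uint8_t>(y);
        cv::Vec3b* dst = color.ptr<cv::Vec3b>(y);
        for (int x = 0; x < label_map.cols; ++x) {
            if (src[x] < 19) dst[x] = kPalette[src[x]];
        }
    }
    return color;
}

int main() {
    // label.png: argmax of the segmentation output, one class index per pixel.
    cv::Mat label = cv::imread("label.png", cv::IMREAD_GRAYSCALE);
    cv::Mat frame = cv::imread("frame.png");
    if (label.empty() || frame.empty()) return 1;

    cv::Mat color = ColorizeMask(label);
    cv::resize(color, color, frame.size(), 0, 0, cv::INTER_NEAREST);

    // Blend the mask with the original frame for visual inspection.
    cv::Mat overlay;
    cv::addWeighted(frame, 0.5, color, 0.5, 0.0, overlay);
    cv::imwrite("overlay.png", overlay);
    return 0;
}
```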


Related Blog

