
ObjectDetection-CoreML

Supported models: YOLOv8, YOLOv5, YOLOv3, MobileNetV2+SSDLite


This project demonstrates object detection on iOS with Core ML.
If you are interested in iOS + machine learning, visit here to see various demos.

(Demo: SSDMobileNetV2 running in the app)
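Under the hood, the app feeds camera frames through a Vision + Core ML request and draws the returned VNRecognizedObjectObservation boxes. A minimal sketch of that flow, assuming an Xcode-generated model class named yolov8s (the class name is an assumption; the real ViewController wires this into the camera capture session):

```swift
import CoreML
import Vision

// Minimal detection sketch. `yolov8s` is the class Xcode generates from the
// bundled model file; the name here is an assumption for illustration.
final class Detector {
    private let request: VNCoreMLRequest

    init() throws {
        let mlModel = try yolov8s(configuration: MLModelConfiguration()).model
        let visionModel = try VNCoreMLModel(for: mlModel)
        request = VNCoreMLRequest(model: visionModel) { request, _ in
            // Models exported with NMS return recognized-object observations.
            guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
            for observation in results {
                let label = observation.labels.first?.identifier ?? "unknown"
                // boundingBox is normalized (0...1) with the origin at the bottom-left.
                print(label, observation.confidence, observation.boundingBox)
            }
        }
        request.imageCropAndScaleOption = .scaleFill
    }

    func detect(in pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }
}
```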

Requirements

  • Xcode 10.3+
  • iOS 13.0+
  • Swift 4.2

How To Build and Run the Project

1. Clone the project

git clone https://github.com/tucan9389/ObjectDetection-CoreML

2. Prepare Core ML model

  • You can download the COCO models, or other models, from here

Or, if you want to train and use a model with a custom dataset:

  1. follow the Roboflow tutorial from scratch or the yolov5 repo's tutorial, and
  2. convert the .pt model to a .mlmodel using our issue.

3. Add the model to the project

By default, the project uses the yolov8s model. If you want to use another model, you can replace the model file in the project.

(Screenshots: adding the model file to the Xcode project)

4. Set the model name properly in ViewController.swift

(Screenshot: the model name in ViewController.swift)
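The exact property name varies between versions of the project, but the idea is that the model is instantiated in a single place in ViewController.swift, so switching models is a one-line change. A hedged sketch (the class name must match the model file you added; `yolov8s` is only an example):

```swift
import CoreML

// ViewController.swift (sketch): change this generated class name to switch models.
// `yolov8s` is an assumption; use the class Xcode generated from your model file.
let objectDetectionModel = try? yolov8s(configuration: MLModelConfiguration())
```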

5. Build and Run

How To Run with your own model

1. Convert your model to Core ML

At the moment (2023-04-08), there is an error when converting YOLOv8 models to Core ML. Once ultralytics/ultralytics#1791 is merged, you can use the following steps. (Alternatively, you can use this PR.)

Prerequisites

pip install ultralytics
pip install coremltools

Option 1) With shell

yolo export model=yolov8n.pt format=coreml nms

Option 2) With Python script

# main.py
from ultralytics import YOLO

if __name__ == '__main__':
    model = YOLO("yolov8n.pt", task='detect')  # load a pretrained model
    model.overrides['nms'] = True              # export with NMS, equivalent to the `nms` flag above
    success = model.export(format="coreml")    # export the model to Core ML format

# in terminal
python main.py
# you will then find the `.mlpackage` or `.mlmodel` file in your current directory
# (you can check your current directory with the `pwd` command)

2. Follow the steps above from Step 3

Models

Model Metadata

(Screenshot: model metadata)
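You can also read the same metadata programmatically instead of through Xcode's model viewer; a small sketch using MLModel's model description (the file name is an assumption):

```swift
import CoreML

// Sketch: dump a bundled, compiled model's inputs, outputs, and metadata.
// "yolov8s" is an assumption; models in the app bundle are compiled to .mlmodelc.
if let url = Bundle.main.url(forResource: "yolov8s", withExtension: "mlmodelc"),
   let model = try? MLModel(contentsOf: url) {
    let description = model.modelDescription
    print("Inputs:", description.inputDescriptionsByName)
    print("Outputs:", description.outputDescriptionsByName)
    print("Description:", description.metadata[.description] ?? "n/a")
}
```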

Model Size, Minimum iOS Version, Download Link

| Model | Size (MB) | Minimum iOS Version | Download Link | Trained Dataset |
| --- | --- | --- | --- | --- |
| yolov8n | 12.7 | iOS14 | Link | |
| yolov8s | 44.7 | iOS14 | Link | |
| yolov8m | 52.1 | iOS14 | Link | |
| yolov8l | 87.8 | iOS14 | Link | |
| yolov8x | 272.9 | iOS14 | Link | |
| yolov5n.mlmodel | 7.52 | iOS13 | Link | COCO |
| yolov5s.mlmodel | 28.0 | iOS13 | Link | COCO |
| yolov5m.mlmodel | 81.2 | iOS13 | Link | COCO |
| yolov5l.mlmodel | 178.0 | iOS13 | Link | COCO |
| yolov5x.mlmodel | 331.0 | iOS13 | Link | COCO |
| yolov5n6.mlmodel | 12.8 | iOS13 | Link | COCO |
| yolov5s6.mlmodel | 48.5 | iOS13 | Link | COCO |
| yolov5m6.mlmodel | 137.0 | iOS13 | Link | COCO |
| yolov5l6.mlmodel | 293.0 | iOS13 | Link | COCO |
| yolov5x6.mlmodel | 537.0 | iOS13 | Link | COCO |
| YOLOv3.mlmodel | 248.4 | iOS12 | Link | COCO |
| YOLOv3FP16.mlmodel | 124.2 | iOS12 | Link | COCO |
| YOLOv3Int8LUT.mlmodel | 62.2 | iOS12 | Link | COCO |
| YOLOv3Tiny.mlmodel | 35.5 | iOS12 | Link | COCO |
| YOLOv3TinyFP16.mlmodel | 17.8 | iOS12 | Link | COCO |
| YOLOv3TinyInt8LUT.mlmodel | 8.9 | iOS12 | Link | COCO |
| MobileNetV2_SSDLite.mlmodel | 9.3 | iOS12 | Link | COCO |
| ObjectDetector.mlmodel | 63.7 | iOS12 | Link | 6 Label Dataset |
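Note that the YOLOv8 conversions require iOS 14 while the YOLOv3-family and SSDLite models go back to iOS 12, so if you bundle more than one model you can fall back at runtime with an availability check. A sketch under that assumption (generated class names are placeholders):

```swift
import CoreML

// Sketch: choose a bundled model based on the OS version, mirroring the
// "Minimum iOS Version" column above. Class names are assumptions.
func makeDetectionModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // let Core ML use the GPU/Neural Engine where available
    if #available(iOS 14.0, *) {
        return try yolov8s(configuration: config).model
    } else {
        return try YOLOv3Tiny(configuration: config).model
    }
}
```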

Trained Dataset Info

COCO Dataset
6 Label Dataset (Apple's demo)
  • Bagel
  • Banana
  • Coffee
  • Croissant
  • Egg
  • Waffle

Performance

Build Setting:
Xcode > Build Settings > Apple Clang - Code Generation > Optimization Level > Fastest [-O3]

(Screenshot: Optimization Level set to Fastest [-O3] in Build Settings)
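The tables below report per-frame inference time (roughly, the model execution itself), total time (the full per-frame pipeline including pre- and post-processing), and the resulting FPS. This is not the project's exact measuring code, but a rough sketch of where such timings would be taken:

```swift
import Foundation
import QuartzCore
import Vision

// Sketch: time a single Vision request. Only illustrates the difference
// between "inference" time and "total" time; not the project's measure helper.
func timedDetect(request: VNCoreMLRequest, pixelBuffer: CVPixelBuffer) {
    let totalStart = CACurrentMediaTime()
    // ... any preprocessing would happen here ...
    let inferenceStart = CACurrentMediaTime()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])  // the request's completion handler runs synchronously here
    let inferenceMs = (CACurrentMediaTime() - inferenceStart) * 1000
    // ... drawing / post-processing would happen here ...
    let totalMs = (CACurrentMediaTime() - totalStart) * 1000
    print(String(format: "inference: %.1f ms, total: %.1f ms, ~%.0f FPS",
                 inferenceMs, totalMs, 1000 / max(totalMs, 1)))
}
```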

Inference Time (ms)

| Model vs. Device | 14 Pro | 13 Pro | 12 Pro | 11 Pro | XS | XS Max | XR | X | 7+ | 7 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| yolov8n | 15 | | | | | | | | | |
| yolov8s | 29 | | | | | | | | | |
| yolov8m | 37 | | | | | | | | | |
| yolov8l | 45 | | | | | | | | | |
| yolov8x | 51 | | | | | | | | | |
| yolov5n | 24 | | | | | | | | | |
| yolov5s | 29 | | | | | | | | | |
| yolov5m | 39 | | | | | | | | | |
| yolov5l | 38 | | | | | | | | | |
| yolov5x | 69 | | | | | | | | | |
| yolov5n6 | 24 | | | | | | | | | |
| yolov5s6 | 34 | | | | | | | | | |
| yolov5m6 | 39 | | | | | | | | | |
| yolov5l6 | 41 | | | | | | | | | |
| yolov5x6 | 57 | | | | | | | | | |
| YOLOv3 | 45 | | | 83 | 108 | 93 | 100 | 356 | 569 | 561 |
| YOLOv3FP16 | 44 | | | 84 | 104 | 89 | 101 | 348 | 572 | 565 |
| YOLOv3Int8LUT | 53 | | | 86 | 101 | 92 | 100 | 337 | 575 | 572 |
| YOLOv3Tiny | 36 | | | 44 | 46 | 41 | 47 | 106 | 165 | 168 |
| YOLOv3TinyFP16 | 33 | | | 44 | 51 | 41 | 44 | 103 | 165 | 167 |
| YOLOv3TinyInt8LUT | 39 | | | 44 | 45 | 39 | 39 | 106 | 160 | 161 |
| MobileNetV2_SSDLite | 17 | | | 18 | 31 | 31 | 31 | 109 | 141 | 134 |
| ObjectDetector | 13 | | | 18 | 24 | 26 | 23 | 63 | 86 | 84 |

Total Time (ms)

| Model vs. Device | 14 Pro | 13 Pro | 12 Pro | 11 Pro | XS | XS Max | XR | X | 7+ | 7 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| yolov8n | 15 | | | | | | | | | |
| yolov8s | 31 | | | | | | | | | |
| yolov8m | 39 | | | | | | | | | |
| yolov8l | 47 | | | | | | | | | |
| yolov8x | 52 | | | | | | | | | |
| yolov5n | 26 | | | | | | | | | |
| yolov5s | 31 | | | | | | | | | |
| yolov5m | 41 | | | | | | | | | |
| yolov5l | 39 | | | | | | | | | |
| yolov5x | 72 | | | | | | | | | |
| yolov5n6 | 25 | | | | | | | | | |
| yolov5s6 | 36 | | | | | | | | | |
| yolov5m6 | 41 | | | | | | | | | |
| yolov5l6 | 42 | | | | | | | | | |
| yolov5x6 | 59 | | | | | | | | | |
| YOLOv3 | 46 | | | 84 | 108 | 93 | 100 | 357 | 569 | 561 |
| YOLOv3FP16 | 45 | | | 85 | 104 | 89 | 101 | 348 | 572 | 565 |
| YOLOv3Int8LUT | 54 | | | 86 | 102 | 92 | 102 | 338 | 576 | 573 |
| YOLOv3Tiny | 37 | | | 45 | 46 | 42 | 48 | 106 | 166 | 169 |
| YOLOv3TinyFP16 | 35 | | | 45 | 51 | 41 | 44 | 104 | 165 | 167 |
| YOLOv3TinyInt8LUT | 41 | | | 45 | 45 | 39 | 40 | 107 | 160 | 161 |
| MobileNetV2_SSDLite | 19 | | | 19 | 32 | 31 | 32 | 109 | 142 | 134 |
| ObjectDetector | 14 | | | 18 | 25 | 26 | 23 | 64 | 87 | 85 |

FPS

| Model vs. Device | 14 Pro | 13 Pro | 12 Pro | 11 Pro | XS | XS Max | XR | X | 7+ | 7 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| yolov8n | 38 | | | | | | | | | |
| yolov8s | 14 | | | | | | | | | |
| yolov8m | 14 | | | | | | | | | |
| yolov8l | 14 | | | | | | | | | |
| yolov8x | 13 | | | | | | | | | |
| yolov5n | 19 | | | | | | | | | |
| yolov5s | 14 | | | | | | | | | |
| yolov5m | 13 | | | | | | | | | |
| yolov5l | 14 | | | | | | | | | |
| yolov5x | 7 | | | | | | | | | |
| yolov5n6 | 19 | | | | | | | | | |
| yolov5s6 | 14 | | | | | | | | | |
| yolov5m6 | 13 | | | | | | | | | |
| yolov5l6 | 14 | | | | | | | | | |
| yolov5x6 | 13 | | | | | | | | | |
| YOLOv3 | 12 | | | 9 | 8 | 10 | 9 | 2 | 1 | 1 |
| YOLOv3FP16 | 13 | | | 9 | 9 | 10 | 8 | 2 | 1 | 1 |
| YOLOv3Int8LUT | 14 | | | 9 | 9 | 10 | 9 | 2 | 1 | 1 |
| YOLOv3Tiny | 14 | | | 14 | 21 | 22 | 20 | 8 | 5 | 5 |
| YOLOv3TinyFP16 | 14 | | | 14 | 19 | 23 | 21 | 9 | 5 | 5 |
| YOLOv3TinyInt8LUT | 11 | | | 14 | 21 | 24 | 23 | 8 | 5 | 5 |
| MobileNetV2_SSDLite | 19 | | | 29 | 23 | 23 | 23 | 8 | 6 | 6 |
| ObjectDetector | 17 | | | 29 | 23 | 23 | 24 | 14 | 10 | 11 |
