MetalCamera


Dennis' Fork

I'm forking this MetalCamera repo to use in a personal project. More specifically, I want to achieve real-time background replacement using a CoreML segmentation model. On top of background replacement, I also want to investigate video composition with masks, as well as adding filters to specific masked areas; a rough sketch of the intended pipeline follows the roadmap below.

RoadMap

  • Crop humans and objects out of the background
  • Composite an image behind the cropped face
  • Add a filter to the cropped face
  • Add a hue filter
  • Replace the background with another video
  • Add the ability to pass uniforms to shader functions
  • Composite two cropped-out videos onto one background
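
As a rough illustration of where this fork is headed, here is a minimal sketch of a background-replacement chain built from pieces that appear elsewhere in this README (CoreMLClassifierHandler, ImageCompositor, and the --> chaining operator). How the segmentation mask actually drives the compositing is exactly what this fork still has to work out, so treat the chain below as intent rather than working code; the "beachBackground" asset name is hypothetical.

import UIKit
import CoreML
import MetalCamera

// Sketch of the intended background-replacement pipeline. Whether
// CoreMLClassifierHandler's segmentation output can drive ImageCompositor
// as a mask is an open question for this fork, not an existing feature.
func setupBackgroundReplacement(camera: MetalCamera,
                                preview: MetalVideoView,
                                segmentationModel: MLModel) throws {
    // Run each camera frame through the DeepLabV3 segmentation model.
    let segmentation = try CoreMLClassifierHandler(segmentationModel)

    // Composite a replacement background; "beachBackground" is a
    // hypothetical asset name used only for illustration.
    let background = ImageCompositor(baseTextureKey: camera.textureKey)
    if let backgroundImage = UIImage(named: "beachBackground") {
        background.addCompositeImage(backgroundImage)
    }

    camera-->segmentation-->background-->preview
}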

Below is the original README

Motivation

MetalCamera is an open source project for performing GPU-accelerated image and video processing on Mac and iOS.

There are many ways to use the GPU, such as CIFilter, but they are either not open source or difficult to extend and contribute to.

The main goal of this repository is to provide an interface, along with performance tests, so that ideas about image processing and machine learning on iOS can be developed and applied to real services more easily.

At this stage, I'm focusing on providing the following features in a simple form.

  • Camera input/output handling
  • Saving image frames to video
  • Basic image processing and filters
  • Downloading and processing CoreML models
  • Visualizing CoreML model results
  • Benchmarking algorithms

There are still a lot of bugs and many things left to implement, but I created this repository because I wanted to develop camera and vision features on iOS together with many people.

Feel free to use it, and open an issue or PR when you have an idea.

Thanks.

Example

To run the example project, clone the repo, and run pod install from the Example directory first.

Camera

    
import MetalCamera

@IBOutlet weak var preview: MetalVideoView!
var camera: MetalCamera!

override func viewDidLoad() {
    super.viewDidLoad()
    guard let camera = try? MetalCamera(useMic: useMic) else { return }
    // Connect the camera output to the preview view with the chaining operator.
    camera-->preview
    self.camera = camera
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    camera?.startCapture()
}

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)
    camera?.stopCapture()
}
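
The --> operator used above chains the camera into any number of processing steps before the preview. As a minimal sketch (using RotationOperation and Gray, which appear in the compositing example later in this README), a filtered preview could be set up inside viewDidLoad like this:

// Sketch: inserting processing operations between the camera and the
// preview. RotationOperation and Gray are the operations used in the
// compositing example below; any chainable operation should slot into
// the chain the same way.
let rotation90 = RotationOperation(.degree90_flip)
let gray = Gray()
camera-->rotation90-->gray-->preview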

Download and load a CoreML model from a web URL

import MetalCamera  

let url = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!

do {
    coreMLLoader = try CoreMLLoader(url: url, isForcedDownload: true)
    coreMLLoader?.load({ (progress) in
        debugPrint("Model downloading.... \(progress)")
    }, { (loadedModel, error) in
        if let loadedModel = loadedModel {
            debugPrint(loadedModel)
        } else if let error = error {
            debugPrint(error)
        }
    })
} catch {
    debugPrint(error)
}
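
Once the download completes, the loaded MLModel can be wrapped in a handler and inserted into the processing chain; the segmentation example below does exactly that with CoreMLClassifierHandler.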

Segmentation Test (DeepLabV3Int8LUT model, iPhone XS, avg. 63 ms)


func loadCoreML() {
    do {
        let modelURL = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!    
        let loader = try CoreMLLoader(url: modelURL)
        loader.load { [weak self] (model, error) in
            if let model = model {
                self?.setupModelHandler(model)
            } else if let error = error {
                debugPrint(error)
            }
        }
    } catch {
        debugPrint(error)
    }
}

func setupModelHandler(_ model: MLModel) {
    do {
        let modelHandler = try CoreMLClassifierHandler(model)
        camera.removeTarget(preview)
        camera-->modelHandler-->preview
    } catch {
        debugPrint(error)
    }
}
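
To switch back to the plain preview (for example, to toggle segmentation off), the handler can presumably be removed from the chain again. removeTarget(_:) is only shown with the preview in this README, so calling it with the model handler below is an assumption:

// Hedged sketch: toggling segmentation off. removeTarget(_:) is shown
// with the preview above; using it with the model handler here is an
// assumption about the API, not documented behavior.
func removeModelHandler(_ modelHandler: CoreMLClassifierHandler) {
    camera.removeTarget(modelHandler)
    camera-->preview
}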

Compositing images or video, and rotation


let rotation90 = RotationOperation(.degree90_flip)

// Composite a still image on top of the camera frames.
let imageCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
guard let testImage = UIImage(named: "sampleImage") else {
    fatalError("Check image resource")
}

let gray = Gray()

let compositeFrame = CGRect(x: 50, y: 100, width: 250, height: 250)
imageCompositor.addCompositeImage(testImage)
imageCompositor.sourceFrame = compositeFrame

// A second compositor for placing another source (e.g. video frames) into its own region.
videoCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
videoCompositor.sourceFrame = CGRect(x: 320, y: 100, width: 450, height: 250)

camera-->rotation90-->gray-->imageCompositor-->videoCompositor-->preview

Filter

  • Lookup Filter

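The README doesn't show the lookup filter's API, so the type and parameter names in this sketch (LookupFilter, lookupImage, and the "sampleLUT" asset) are assumptions for illustration only; the general idea is that a LUT image drives a color-grading operation inserted into the chain like any other.

// Hypothetical sketch only: LookupFilter, its lookupImage parameter,
// and the "sampleLUT" asset are assumed names, not a documented API.
if let lutImage = UIImage(named: "sampleLUT") {
    let lookup = LookupFilter(lookupImage: lutImage)
    camera-->lookup-->preview
}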

Recording video and audio

do {
    if FileManager.default.fileExists(atPath: recordingURL.path) {
        try FileManager.default.removeItem(at: recordingURL)
    }

    recorder = try MetalVideoWriter(url: recordingURL, videoSize: CGSize(width: 720, height: 1280), recordAudio: useMic)
    if let recorder = recorder {
        preview-->recorder
        if useMic {
            camera==>recorder
        }

        recorder.startRecording()
    }
} catch {
    debugPrint(error)
}
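
The snippet above only starts recording. How a recording is finished isn't shown in this README, so the sketch below is an assumption: finishRecording() and preview.removeTarget(_:) are guessed names to be checked against MetalVideoWriter's actual API.

// Hedged sketch: stopping the recording. finishRecording() and
// preview.removeTarget(_:) are assumed names -- verify against the
// MetalVideoWriter API before relying on them.
if let recorder = recorder {
    recorder.finishRecording()
    preview.removeTarget(recorder)
    self.recorder = nil
}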

Requirements

  • Swift 5
  • Xcode 11.5 or higher on Mac
  • iOS: 13.0 or higher

Installation

MetalCamera is available through CocoaPods. To install it, simply add the following line to your Podfile:

pod 'MetalCamera'

References

When creating this repository, I referenced several other open source repositories extensively. Thanks to those who did that work and opened it up in advance; please let me know if there are any problems.

Author

jsharp83, [email protected]

License

MetalCamera is available under the MIT license. See the LICENSE file for more info.
