I'm forking this MetalCamera repo to use in a personal project. Specifically, I want to achieve real-time background replacement using the segmentation-map CoreML service. Beyond background replacement, I also want to investigate video composition with masks, as well as applying filters to specific masked areas.
- Crop humans and objects out of the background
- Composite an image behind the cropped face
- Apply a filter to the cropped face
- Add a hue filter
- Replace the background with another video
- Add the ability to pass uniforms to the shader functions
- Composite two cropped-out videos onto one background
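As a rough starting point, the background-replacement goal above could build on the chaining operator and handler types the original README demonstrates. This is only a sketch under stated assumptions: `setupBackgroundReplacement` and the `"beachBackground"` asset are hypothetical names I introduce here, and whether `ImageCompositor` can consume a segmentation mask directly is an open question for the fork, not confirmed library behavior.

```swift
import UIKit
import CoreML
import MetalCamera

// Hypothetical pipeline sketch -- not confirmed MetalCamera API beyond the
// types shown in the README (MetalCamera, CoreMLClassifierHandler,
// ImageCompositor, MetalVideoView, and the --> operator).
func setupBackgroundReplacement(camera: MetalCamera,
                                preview: MetalVideoView,
                                segmentationModel: MLModel) throws {
    // Run the DeepLabV3 segmentation model on each camera frame to
    // produce a person/background mask.
    let segmentationHandler = try CoreMLClassifierHandler(segmentationModel)

    // Composite a replacement image where the mask marks "background".
    // "beachBackground" is a placeholder asset name.
    let backgroundCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
    if let background = UIImage(named: "beachBackground") {
        backgroundCompositor.addCompositeImage(background)
    }

    // Chain: camera frames --> segmentation --> composition --> preview.
    camera-->segmentationHandler-->backgroundCompositor-->preview
}
```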
Below is the original README.
MetalCamera is an open source project for performing GPU-accelerated image and video processing on Mac and iOS.
There are many ways to use the GPU, such as CIFilter, but many of them are not open source and are difficult to extend or contribute to.
The main goal of this repository is to provide a simple interface for prototyping image-processing and machine-learning ideas on iOS, testing their performance, and applying them to real services more easily.
At this stage, the focus is on providing the following features in a simple form:
- Camera input/output handling
- Saving image frames to video
- Basic image processing and filters
- Downloading and running CoreML models
- Visualizing CoreML model results
- Benchmarking algorithms
There are still a lot of bugs and many things to implement, but I created this repository because I wanted to develop camera and vision features for iOS together with many people.
Feel free to use it, and open issues and PRs when you have ideas.
Thanks.
To run the example project, clone the repo and run `pod install` from the Example directory first.
```swift
import MetalCamera

@IBOutlet weak var preview: MetalVideoView!
var camera: MetalCamera!

override func viewDidLoad() {
    super.viewDidLoad()
    guard let camera = try? MetalCamera(useMic: useMic) else { return }
    camera-->preview
    self.camera = camera
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    camera?.startCapture()
}

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)
    camera?.stopCapture()
}
```
```swift
import MetalCamera

let url = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!

do {
    coreMLLoader = try CoreMLLoader(url: url, isForcedDownload: true)
    coreMLLoader?.load({ (progress) in
        debugPrint("Model downloading.... \(progress)")
    }, { (loadedModel, error) in
        if let loadedModel = loadedModel {
            debugPrint(loadedModel)
        } else if let error = error {
            debugPrint(error)
        }
    })
} catch {
    debugPrint(error)
}
```
```swift
func loadCoreML() {
    do {
        let modelURL = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!
        let loader = try CoreMLLoader(url: modelURL)
        loader.load { [weak self] (model, error) in
            if let model = model {
                self?.setupModelHandler(model)
            } else if let error = error {
                debugPrint(error)
            }
        }
    } catch {
        debugPrint(error)
    }
}

func setupModelHandler(_ model: MLModel) {
    do {
        let modelHandler = try CoreMLClassifierHandler(model)
        camera.removeTarget(preview)
        camera-->modelHandler-->preview
    } catch {
        debugPrint(error)
    }
}
```
```swift
let rotation90 = RotationOperation(.degree90_flip)
let imageCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
guard let testImage = UIImage(named: "sampleImage") else {
    fatalError("Check image resource")
}

let gray = Gray()
let compositeFrame = CGRect(x: 50, y: 100, width: 250, height: 250)
imageCompositor.addCompositeImage(testImage)
imageCompositor.sourceFrame = compositeFrame

videoCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
videoCompositor.sourceFrame = CGRect(x: 320, y: 100, width: 450, height: 250)

camera-->rotation90-->gray-->imageCompositor-->videoCompositor-->preview
```
- Lookup Filter
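The original snippet for this section is missing here. A minimal usage sketch follows, modeled on GPUImage-style lookup (LUT) filters; the `LookupFilter` initializer and the `"lookup_amatorka"` asset name are assumptions for illustration, not confirmed MetalCamera API.

```swift
// Hypothetical lookup-filter sketch: apply a 512x512 color lookup table
// to the camera stream. LookupFilter and the asset name are assumptions.
guard let lutImage = UIImage(named: "lookup_amatorka") else {
    fatalError("Add a 512x512 lookup table image to the asset catalog")
}
let lookup = LookupFilter(lutImage)
camera-->lookup-->preview
```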
```swift
do {
    if FileManager.default.fileExists(atPath: recordingURL.path) {
        try FileManager.default.removeItem(at: recordingURL)
    }

    recorder = try MetalVideoWriter(url: recordingURL, videoSize: CGSize(width: 720, height: 1280), recordAudio: useMic)
    if let recorder = recorder {
        preview-->recorder
        if useMic {
            camera==>recorder
        }
        recorder.startRecording()
    }
} catch {
    debugPrint(error)
}
```
- Swift 5
- Xcode 11.5 or higher on Mac
- iOS: 13.0 or higher
MetalCamera is available through CocoaPods. To install it, simply add the following line to your Podfile:

```ruby
pod 'MetalCamera'
```
When creating this repository, I referenced the following repositories a lot. First of all, thanks to those who did this work and opened up so much of it in advance; please let me know if there are any problems.
jsharp83, [email protected]
MetalCamera is available under the MIT license. See the LICENSE file for more info.