FacefARt is an app for the iPhone X that demonstrates the potential of the TrueDepth front-facing camera and ARKit to make the user's face fart when they open their mouth.
A passion for fart apps is the only thing you need to get started.
I guess you'll also need an iPhone X and Xcode 9, and the ability to load a debug version of the app onto the device.
The FacefARt app works by starting an ARKit session: it runs the ARSCNView's session with an ARFaceTrackingConfiguration. The main view controller is set as the ARSCNView's delegate, so ARKit calls its renderer(_:didUpdate:for:) method, passing along an ARFaceAnchor that contains information about the various anchor points on the user's face.
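A minimal sketch of that setup, assuming a storyboard-connected ARSCNView outlet named `sceneView` and a bundled "fart.mp3" sound (both names are illustrative, not necessarily what the app uses):

```swift
import UIKit
import ARKit
import AVFoundation

class ViewController: UIViewController, ARSCNViewDelegate {
    // Assumed outlet name; connect to the ARSCNView in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    // Assumed audio setup: a (very long) fart sound bundled as "fart.mp3".
    lazy var fartPlayer: AVAudioPlayer? = {
        guard let url = Bundle.main.url(forResource: "fart", withExtension: "mp3") else { return nil }
        return try? AVAudioPlayer(contentsOf: url)
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking needs the TrueDepth camera (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```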
We only care about the ARFaceAnchor.BlendShapeLocation.mouthClose value, which gives us the "openness" of the user's mouth. When the user opens their mouth, we start playing a fart sound. When an update's mouthClose value tells us the user has closed their mouth, we stop the fart sound. It's a really long fart sound, so if you leave your mouth open it'll fart for a while.
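Roughly how the delegate callback could read that blend shape and toggle playback, extending the sketch above (the 0.1 threshold and the exact open/close logic are illustrative assumptions, not the app's actual values):

```swift
extension ViewController {
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        // Only face anchors carry blend shape coefficients.
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let mouthClose = faceAnchor.blendShapes[.mouthClose]?.floatValue else { return }

        // Illustrative threshold: a low mouthClose coefficient means the lips are apart.
        let mouthIsOpen = mouthClose < 0.1

        // The renderer callback arrives off the main thread; touch the player on main.
        DispatchQueue.main.async {
            if mouthIsOpen {
                if self.fartPlayer?.isPlaying == false {
                    self.fartPlayer?.play()
                }
            } else {
                self.fartPlayer?.stop()
                self.fartPlayer?.currentTime = 0
            }
        }
    }
}
```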
- Creating Face-Based AR Experiences - Apple's sample code on how to use the ARKit Face stuff.
- Hung Truong - hungtruong
This project is licensed under the MIT License - see the LICENSE.md file for details
- Thanks to Apple for not canceling my developer license
- Also thanks to Apple for providing some really good sample code