How to start AR Face Tracking in less than 10 minutes!

Hey all, today I'll be showing you a simple AR face tracking app that you can download and play around with to further your understanding of face tracking with ARKit.

If you want to watch the tutorial instead, the YouTube link is at the bottom of this post.

Otherwise, this tutorial assumes you are familiar with Xcode and Swift but are new to AR. I recommend downloading the source code to follow along with these instructions.

Let's begin!

Step 1 : In Xcode, create a new project and choose the Augmented Reality App template


Step 2 : Navigate to ViewController.swift


Step 3 : Import SceneKit and ARKit


Step 4 : Add the ARSCNViewDelegate protocol to the class


Step 5 : Create a scene view variable using ARSCNView


Step 6 : Create a UILabel


Step 7 : Create an empty string variable

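Putting Steps 3–7 together, the top of ViewController.swift looks roughly like the sketch below. The exact outlet and variable names are assumptions based on the bottomLabel and bottomText names used later in this tutorial; check the repository for the exact code.

```swift
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    // Step 5: the AR scene view (assumed to be wired up in the storyboard)
    @IBOutlet var sceneView: ARSCNView!

    // Step 6: label that will display the detected expression
    @IBOutlet weak var bottomLabel: UILabel!

    // Step 7: holds the text describing the current expression
    var bottomText = ""
}
```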

Step 8 : Create viewDidLoad() and set the sceneView delegate


Step 9 : Create viewWillAppear()

Step 10 : Create a session configuration using ARFaceTrackingConfiguration()

Step 11 : Run the configuration


Step 12 : Create a viewWillDisappear function

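Steps 8–12 give the view lifecycle methods, roughly as sketched below. The isSupported guard is an extra safety check I've added here, since face tracking only works on devices with a TrueDepth camera; the original code may not include it.

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    // Step 8: receive ARSCNViewDelegate callbacks in this class
    sceneView.delegate = self
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Assumption: bail out on devices without a TrueDepth camera
    guard ARFaceTrackingConfiguration.isSupported else { return }

    // Step 10: create a face tracking session configuration
    let configuration = ARFaceTrackingConfiguration()

    // Step 11: run the configuration
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Step 12: stop the AR session when the view goes away
    sceneView.session.pause()
}
```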

Step 13 : Implement the renderer(_:nodeFor:) delegate method, which provides a new SCNNode object for the face anchor


Step 14 : Create an ARSCNFaceGeometry object


Step 15 : Create an SCNNode and feed it the geometry


Step 16 : Set the node's geometry fill mode to .lines so the mesh renders as a wireframe

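Steps 13–16 combine into a single delegate method, roughly:

```swift
// Step 13: called when ARKit adds the face anchor; returns the node to attach
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let device = sceneView.device else { return nil }

    // Step 14: geometry that mirrors the detected face
    let faceGeometry = ARSCNFaceGeometry(device: device)

    // Step 15: wrap the geometry in a node
    let node = SCNNode(geometry: faceGeometry)

    // Step 16: draw the mesh as lines instead of a solid surface
    node.geometry?.firstMaterial?.fillMode = .lines
    return node
}
```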

Step 17 : Implement the renderer(_:didUpdate:for:) delegate method, which is called whenever a SceneKit node's properties have been updated to match its anchor


Step 18 : Add an if statement to detect every time the face mesh changes


Step 19 : Call a method (which we will add after this function) that takes an ARFaceAnchor as a parameter


Step 20 : Update the bottomLabel on the main thread

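Steps 17–20 combine into the second delegate method, roughly as follows (readMyFace is the helper we add next):

```swift
// Step 17: fires whenever ARKit updates the node backing the face anchor
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Step 18: react only to face anchors backed by our face geometry
    if let faceAnchor = anchor as? ARFaceAnchor,
       let faceGeometry = node.geometry as? ARSCNFaceGeometry {
        // Keep the rendered mesh in sync with the tracked face
        faceGeometry.update(from: faceAnchor.geometry)

        // Step 19: interpret the face data
        readMyFace(anchor: faceAnchor)

        // Step 20: UIKit must be touched on the main thread
        DispatchQueue.main.async {
            self.bottomLabel.text = self.bottomText
        }
    }
}
```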

Step 21: Create a readMyFace function that takes an ARFaceAnchor as a parameter


Step 22 : Read the blend-shape values for the facial features you want to track from the ARFaceAnchor


Step 23 : Set bottomText to "You are still faced" as the default at the start of the function


Step 24 : Create different if statements that check the anchor's blend-shape values and update the bottomText value accordingly

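Steps 21–24 come together roughly as below. Which blend shapes are checked, the thresholds, and the label strings (other than "You are still faced") are illustrative guesses; see the repository for the exact expressions the app tracks.

```swift
// Step 21: interpret the face anchor's blend-shape data
func readMyFace(anchor: ARFaceAnchor) {
    // Step 22: blend-shape coefficients range from 0.0 (neutral) to 1.0 (fully expressed)
    let smileLeft = anchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
    let smileRight = anchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0
    let tongueOut = anchor.blendShapes[.tongueOut]?.floatValue ?? 0

    // Step 23: default text while the face is neutral
    bottomText = "You are still faced"

    // Step 24: override the text when an expression is detected
    if smileLeft + smileRight > 0.9 {
        bottomText = "You are smiling"     // hypothetical label string
    }
    if tongueOut > 0.5 {
        bottomText = "Your tongue is out"  // hypothetical label string
    }
}
```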

And you are done!

Run the app and you should see your face covered in a geometric mesh, with a label at the bottom describing your facial expression.

Thank you for following this tutorial and I wish you a nice day!

Repository : github.com/khal0160/ARFaceTracking.git

Youtube video : youtu.be/eeA06alRcMs
