Android Face-Tracking Playground
One of my first sample apps when I started on Android was a book library, and at the time I used a third-party barcode scanner library.
It was only later, in late 2015, that Google announced the Mobile Vision API as part of Google Play Services, which included the vision.barcode package. Well, I never used the barcode package, but I made a mental note to try the vision.face package. And from that note, this post.
Vision Face API
The vision face API allows developers to detect human faces and track face landmarks from an image or video in a reasonably simple way. Its usage requires a few simple components:
- Camera permission. I will not discuss how Android runtime permissions should be requested and handled in this post. If you need more info on this topic, check the Android developer training.
- Camera Source. This is simply the image or video source.
- Face Detector. The face detector takes the image/video source and applies the face detection algorithms.
- Face Tracker. The face tracker is associated with the face detector and is the place where we receive and process the detection events.
I want to note that the face detector can work with different kinds of sources and in different configurations.
The one I will describe corresponds to a more generic pipeline structure where we associate a stream source (e.g. a camera source) with a processor that tracks and reacts to any detection event relevant to our application.
The face detector could also work synchronously, for instance on a static frame, but I will not discuss that in this short-attention-span article.
Let’s build a face detection pipeline
We can see how this all works in a simple example I published as the face-tracking-sample app on GitHub. The app controls several views with a blink of your eyes, using the few main components enumerated above.
The Face Tracker
We said the face tracker receives the detection events over time. For our example, we’ll create a tracker that reacts whenever the user blinks either eye.
FaceTracker extends the vision Tracker callback class, which is used to receive notifications about detected items, in this case Face items.
There are four methods we could override, but for our simple app we only override the onUpdate() method to receive updates on a detected face item.
The face detector is able to detect multiple faces, and the Tracker callback methods give us identifiers for the different detected faces. In this example, to keep it simple, we don’t distinguish between faces.
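Here is a minimal sketch of what such a tracker could look like. The eye-open probabilities and Face.UNCOMPUTED_PROBABILITY come from the vision API; the BLINK_THRESHOLD value and the onBlink() callback are just illustrative names for this example.

```java
import com.google.android.gms.vision.Detector;
import com.google.android.gms.vision.Tracker;
import com.google.android.gms.vision.face.Face;

class FaceTracker extends Tracker<Face> {

    // Illustrative threshold: below this probability we consider the eye closed.
    private static final float BLINK_THRESHOLD = 0.1f;

    @Override
    public void onUpdate(Detector.Detections<Face> detections, Face face) {
        float leftOpen = face.getIsLeftEyeOpenProbability();
        float rightOpen = face.getIsRightEyeOpenProbability();

        // The detector reports -1 when the classification could not be computed.
        if (leftOpen == Face.UNCOMPUTED_PROBABILITY
                || rightOpen == Face.UNCOMPUTED_PROBABILITY) {
            return;
        }

        if (leftOpen < BLINK_THRESHOLD || rightOpen < BLINK_THRESHOLD) {
            onBlink(); // hypothetical callback where the app reacts to the blink
        }
    }

    private void onBlink() {
        // React here, e.g. toggle one of the views in the sample app.
    }
}
```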
The Face Detector
The API provides a very simple interface to create a face detector using the associated builder class.
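As a rough sketch, using the FaceDetector.Builder from the vision API (the context variable and the option values are assumptions for this example), we enable classification because that is what exposes the eye-open probabilities used by the tracker:

```java
// context is assumed to be an available Context (e.g. the hosting Activity).
FaceDetector mFaceDetector = new FaceDetector.Builder(context)
        .setProminentFaceOnly(true)                               // only the most prominent face
        .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)  // needed for eye-open probabilities
        .setTrackingEnabled(true)                                 // keep the same face id across frames
        .build();
```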
Once the detector is created, we need to set the tracker as its processor, so it receives and reacts to the detection events.
```java
mFaceDetector.setProcessor(new LargestFaceFocusingProcessor(mFaceDetector, new FaceTracker()));
```
We now have a working detector.
The Camera Source
To complete the detection pipeline, we need a source that feeds our face detector. We will use the front camera as the source and create it using the provided builder API.
Face detection can run at lower resolutions. Our sample app is meant to track a prominent face: only one user is supposed to be blinking in front of the camera. In that case we can use a low resolution, improving detection speed without hurting accuracy. For applications where smaller faces need to be detected, there is a lower bound on the resolution.
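A minimal sketch of the camera source, assuming the detector built above and a surfaceHolder from the preview view; the preview size and frame rate are just example values:

```java
CameraSource mCameraSource = new CameraSource.Builder(context, mFaceDetector)
        .setFacing(CameraSource.CAMERA_FACING_FRONT)
        .setRequestedPreviewSize(320, 240)  // a low resolution is enough for one prominent face
        .setRequestedFps(30.0f)
        .build();

// Start feeding frames to the detector once the camera permission is granted.
// CameraSource.start() throws IOException, so call it inside a try/catch.
mCameraSource.start(surfaceHolder);
```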
And that’s it! The only thing left is implementing the actions for the detection events so that your app reacts to the user’s face.
Sample App
I was pleasantly surprised by the simplicity of the vision API and how easy it is to create a fully functional app that leverages face detection, tracks the user’s eyes using the front camera and reacts to “a blink of an eye”.
- Check the full app source code in my face-tracking-sample GitHub project, or
- download it from the Play Store beta channel.
The app requires Google Play Services to work.
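On first run, Play Services may still need to download the native face detection library, so it is worth checking whether the detector is ready. A small sketch, assuming the mFaceDetector built earlier and a TAG constant for logging:

```java
if (!mFaceDetector.isOperational()) {
    // The native library is not yet available; the detector won't return results until it is.
    Log.w(TAG, "Face detector dependencies are not yet available.");
}
```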
The app was tested on Nexus 5 and 5X devices running API levels 23 and 24.
If you liked the article, click the 💚 below so other people will see it here on Medium.