December 17, 2024
TL;DR: Use face detection to calculate the distance and angle of your phone relative to your face.
Here is a demo where this is used to set the shadow and size of a text label.
For a longer demo see: https://www.youtube.com/watch?v=yLAtc7AzjIk
Earlier this month I had a discussion with a colleague about new forms of user interaction. On mobile devices there are now many applications that use the compass and/or accelerometer. On game consoles it's very common to use a camera to interact with a game (Xbox Kinect and PlayStation Move). I thought it would be nice if we could use the front-facing camera of a phone for interacting with an app, so I had to see if this idea could be implemented.
Since iOS 5, Apple has included a face detection API. My idea was to detect a face and estimate how far the device is from your face based on the size of the detected face: if the rectangle of the detected face is big, the device is close to your face; if it's small, the device is far away. With a similar technique it is also possible to calculate the angle of the device in relation to the face. This angle is calculated based on how far the detected face rectangle is from the center of the screen.
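A back-of-the-envelope sketch of that estimation could look like the code below. The calibration constants and helper functions are made up for illustration; they are not the actual EVFaceTracker code:

#import <UIKit/UIKit.h>

// Hypothetical calibration: the face width (in points) observed when the
// device is held at a known reference distance (in cm).
static const float kReferenceFaceWidth = 150.0f;
static const float kReferenceDistanceCm = 30.0f;

// A face that appears twice as wide is roughly half as far away.
static float EstimatedDistanceCm(CGRect faceRect) {
    return kReferenceDistanceCm * (kReferenceFaceWidth / faceRect.size.width);
}

// The face's offset from the screen center hints at the device's angle
// relative to the face.
static CGPoint OffsetFromCenter(CGRect faceRect, CGSize screenSize) {
    return CGPointMake(CGRectGetMidX(faceRect) - screenSize.width / 2.0f,
                       CGRectGetMidY(faceRect) - screenSize.height / 2.0f);
}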
The challenge here was to hook into the video stream and detect the face in each frame. Fortunately, Apple has a sample project for this called SquareCam. With the code from that project it's possible to detect faces at about 5 frames per second (on an iPhone 4S).
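In broad strokes, the per-frame hook works as in the sketch below: an AVCaptureVideoDataOutput delegate receives every frame and feeds it to a CIDetector. This is a simplified illustration of the approach, not the actual SquareCam code:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Called by AVCaptureVideoDataOutput for every frame from the front camera.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Low accuracy keeps detection fast enough for a live video stream.
    // (In a real app you would create the detector once and reuse it.)
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];
    for (CIFaceFeature *face in [detector featuresInImage:image]) {
        // face.bounds is the detected face rectangle in image coordinates.
        NSLog(@"Face at %@", NSStringFromCGRect(face.bounds));
    }
}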
There are a couple of fun things you can do with face tracking; the delegate code below uses it to move a label's shadow for a fake 3D effect and to zoom a view based on distance.
EVFaceTracker is now available through the dependency manager CocoaPods. You can install CocoaPods by executing:
[sudo] gem install cocoapods
Once you have CocoaPods installed, you can add EVFaceTracker to your workspace by adding the following line to your Podfile:
pod "EVFaceTracker"
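If your project does not have a Podfile yet, a minimal one could look something like this (the target name MyApp is just a placeholder for your own app target):

platform :ios, '7.0'

target 'MyApp' do
  pod 'EVFaceTracker'
end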
You can also just copy EVFaceTracker.m and .h to your project.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Start tracking your face.
    evFaceTracker = [[EVFaceTracker alloc] initWithDelegate:self];

    // And give us a smooth update 10 times per second.
    [evFaceTracker fluidUpdateInterval:0.1f withReactionFactor:0.5f];
}
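For this to compile, the view controller must declare that it conforms to the delegate protocol and hold on to the tracker and the views used below. A minimal interface could look like this (the property names match the snippets on this page; the rest is an assumption about your setup):

#import <UIKit/UIKit.h>
#import "EVFaceTracker.h"

@interface ViewController : UIViewController <EVFaceTrackerDelegate> {
    EVFaceTracker *evFaceTracker;
}

// The label whose shadow follows your face, and the view that zooms with distance.
@property (strong, nonatomic) IBOutlet UILabel *dynamicLabel;
@property (strong, nonatomic) IBOutlet UIView *dynamicView;

@end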
#pragma mark - <EVFaceTrackerDelegate>
// This delegate method is called every time face detection has detected something (including changes).
- (void)faceIsTracked:(CGRect)faceRect withOffsetWidth:(float)offsetWidth andOffsetHeight:(float)offsetHeight andDistance:(float)distance {
    [CATransaction begin];
    [CATransaction setAnimationDuration:0.2];
    CALayer *layer = dynamicLabel.layer;
    layer.masksToBounds = NO;
    layer.shadowOffset = CGSizeMake(offsetWidth / 5.0f, offsetHeight / 10.0f);
    layer.shadowRadius = 5;
    layer.shadowOpacity = 0.5;
    [CATransaction commit];
}
// Once fluidUpdateInterval:withReactionFactor: has been called, this delegate method will be called at a regular interval.
- (void)fluentUpdateDistance:(float)distance {
    // Animate to the zoom level.
    float effectiveScale = distance / 60.0f;
    [CATransaction begin];
    [CATransaction setAnimationDuration:0.1f];
    [dynamicView.layer setAffineTransform:CGAffineTransformMakeScale(effectiveScale, effectiveScale)];
    [CATransaction commit];
}
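Depending on how close the face gets, distance / 60.0f can become very small or very large. If that bothers you, a hypothetical refinement is to clamp the scale before animating (this is not part of the demo code):

// Keep the zoom factor within a sensible range before animating.
float effectiveScale = MAX(0.5f, MIN(distance / 60.0f, 2.0f));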