The project’s official title is Ready-Aim-Fly, which describes the three separate phases required to achieve a successful flight. In the “Ready” stage, the modified Parrot Bebop drone learns the user’s face while they hold a neutral expression; the user then trains a distinct “trigger” expression that will later launch the flight. In the “Aim” phase, the drone takes off and keeps the user centered in its camera view. The user lines up the desired trajectory and then steps farther away from the drone to increase how far it will fly, a bit like pulling back the arm of a catapult. Finally, in the “Fly” stage, the drone performs its programmed trajectory.
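If you’re curious how that catapult analogy might look in code, here’s a minimal Python sketch. To be clear, this isn’t the researchers’ code: the function names, the linear gain, and the use of apparent face size as a stand-in for distance are all assumptions made purely for illustration.

```python
# Illustrative sketch only: map how far the user backs away from the
# hovering drone to how far the drone will fly, catapult-style.
# All names and constants here are assumptions, not the paper's code.

NEUTRAL_FACE_HEIGHT_PX = 120.0   # apparent face height at the "Ready" distance (assumed)
BASELINE_DISTANCE_M = 1.5        # assumed starting distance between user and drone
RANGE_GAIN = 12.0                # metres of flight per metre of backing away (assumed)

def estimate_user_distance(face_height_px: float) -> float:
    """Rough pinhole-camera estimate: a face that appears half as tall is twice as far away."""
    return BASELINE_DISTANCE_M * (NEUTRAL_FACE_HEIGHT_PX / face_height_px)

def commanded_flight_distance(face_height_px: float) -> float:
    """'Aim' phase: the farther the user has backed away, the farther the drone will fly."""
    pull_back = estimate_user_distance(face_height_px) - BASELINE_DISTANCE_M
    return max(0.0, RANGE_GAIN * pull_back)

if __name__ == "__main__":
    for px in (120, 90, 60, 40):
        print(f"face height {px:3d}px -> fly {commanded_flight_distance(px):5.1f} m")
```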
Different trigger expressions can make the drone travel in a straight line or return to the user like a boomerang. In outdoor tests, the Ready-Aim-Fly system dispatched the drone on flights of close to 150 feet. In all, the idea is pretty darn wacky, but we kind of love the concept of harnessing recent breakthroughs in image recognition (and particularly facial recognition) to issue non-verbal commands to a robot.
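And here’s an equally rough sketch of how a recognized trigger expression could pick between those two flight patterns. Again, the expression labels and waypoint lists are invented for this example rather than taken from the team’s implementation.

```python
# Illustrative only: choose a trajectory type based on the detected trigger expression.
# The labels "smile" and "eyebrow_raise" are placeholders, not the paper's actual triggers.
from typing import Callable, Dict, List, Tuple

Waypoint = Tuple[float, float]  # (distance along the aimed direction in m, height in m)

def straight_trajectory(distance_m: float) -> List[Waypoint]:
    """Fly straight out along the aimed direction, then land."""
    return [(0.0, 1.5), (distance_m, 1.5), (distance_m, 0.0)]

def boomerang_trajectory(distance_m: float) -> List[Waypoint]:
    """Fly out along the aimed direction, then come back to the user."""
    return [(0.0, 1.5), (distance_m, 1.5), (0.0, 1.5)]

TRIGGERS: Dict[str, Callable[[float], List[Waypoint]]] = {
    "smile": straight_trajectory,           # placeholder trigger expression
    "eyebrow_raise": boomerang_trajectory,  # placeholder trigger expression
}

def plan_flight(expression: str, distance_m: float) -> List[Waypoint]:
    return TRIGGERS[expression](distance_m)

if __name__ == "__main__":
    print(plan_flight("smile", 45.0))           # roughly 150 ft, one-way
    print(plan_flight("eyebrow_raise", 45.0))   # out and back, boomerang style
```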
“The demo is cute and small scale, but we are serious about this interface,” Richard Vaughan, one of the researchers on the project, told Digital Trends. “The important part is that the robot flies in a parabola, as if it was an object being thrown. People are really good at throwing things, so the interaction is easy to learn in one demonstration. The ability to place a drone in 3D from a quick interaction is new and powerful. With a little practice, one can send the robot over a building, or onto its roof, in a couple of seconds of aiming. The user carries no special equipment and doesn’t need their hands free. Since we did this work, we are now able to read facial expressions very accurately — using the same techniques as the iPhone X animated emojis — so we can send off the robot with a big smile.”
A paper describing the project, titled “Ready-Aim-Fly! Hands-Free Face-Based HRI for 3D Trajectory Control of UAVs,” was presented at the Institute of Electrical and Electronics Engineers Canadian Conference on Computer and Robot Vision.