There are all manner of weird and wonderful control systems being invented to help drone pilots guide their unmanned aerial vehicles through the skies. One that sounds pretty intuitive, though, is laid out in a new piece of research from engineers at New York University, the University of Pennsylvania, and the U.S. Army Research Laboratory. They have developed a method that lets pilots fly a drone using nothing more than a pair of eye-tracking glasses. What could be simpler?
“This solution provides the opportunity to create new, non-invasive forms of interactions between a human and robots, allowing the human to send new 3D-navigation waypoints to the robot in an uninstrumented environment,” Dr. Giuseppe Loianno, assistant professor at New York University and director of the Agile Robotics and Perception Lab, told Digital Trends. “The user can control the drone [by] just pointing at a spatial location using his gaze, which is distinct from the head orientation in our case.”
The method is both easy to use and self-contained. In terms of hardware, it requires the drone (obviously!), a small computational unit, and a pair of Tobii Pro Glasses 2. These gaze-tracking glasses pack an inertial measurement unit (IMU) and a built-in HD camera. By combining a deep neural network with head-orientation data from the IMU, the system works out where the user is looking and estimates how far away the drone is.
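The paper's exact pipeline isn't reproduced here, but the core geometric step, turning a 2D gaze fixation plus a depth estimate into a 3D point the drone can fly to, is standard pinhole-camera back-projection. Below is a minimal Python sketch of that idea; the function name, camera intrinsics, and numbers are illustrative assumptions, not the researchers' actual code.

```python
import numpy as np

def gaze_to_waypoint(gaze_px, K, depth_m, R_head, head_pos):
    """Back-project a gaze pixel into a 3D waypoint (illustrative sketch).

    gaze_px:  (u, v) gaze fixation in the glasses' camera image
    K:        3x3 intrinsic matrix of the glasses' camera (assumed values)
    depth_m:  estimated distance to the gaze target, in meters
              (e.g. supplied by the drone-detection neural network)
    R_head:   3x3 head-orientation rotation matrix from the IMU
              (camera frame -> world frame)
    head_pos: wearer's head position in the world frame
    """
    # Lift the pixel to a unit-length viewing ray in the camera frame.
    ray_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    ray_cam /= np.linalg.norm(ray_cam)
    # Scale the ray by the depth estimate, rotate it into the world
    # frame, and offset by the head position to get an absolute waypoint.
    return head_pos + R_head @ (ray_cam * depth_m)

# Example with made-up numbers: gaze slightly right of image center,
# target estimated 4 m away, head level at 1.7 m above the ground.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
waypoint = gaze_to_waypoint((350, 240), K, 4.0,
                            np.eye(3), np.array([0.0, 0.0, 1.7]))
print(waypoint)  # 3D point handed off to the drone's planner
```

The point of the sketch is simply that once the glasses know the gaze direction (distinct from head direction) and roughly how far away the target is, a single rotate-and-scale operation yields a waypoint the drone's autonomy stack can fly to.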
The researchers hope that such technology could help people with little flying experience pilot drones safely, without the need for an expert at the controls.
“The proposed solution opens up new ways to interpret human attention and create new anticipative human-robot interfaces,” Loianno continued. “We aim to create new ways of interaction between agents. Specifically, we are interested [in developing] a multi-modal interaction setup — [featuring] visual, vocal, [and gesture-based interactions] — and [adding] multiple agents in the framework. We would also [like to] investigate the benefits that the proposed solution can provide to people affected by body or ocular diseases.”
A paper describing the work, titled “Human Gaze-Driven Spatial Tasking of an Autonomous MAV,” was recently submitted to the 2019 International Conference on Robotics and Automation, which will take place in May 2019. Make sure you keep your eyes trained in that direction!