As all long-time fans of The Avengers are aware, even an android can cry. But does that mean robots can feel? It may depend on how you define the term, as a new “skin” for a robotic arm, created by a team at the Georgia Institute of Technology in Atlanta, certainly fulfills one understanding of the idea. The invention contains hundreds of individual sensors embedded along the arm’s length to help it sense and respond to any obstacles that appear along its path.
The 384 sensors are embedded in what Charlie Kemp and colleagues call “flexible electronic skin,” which covers an arm built by Meka Robotics, a company based in San Francisco. Drawing on the information those sensors collect, the arm uses its sense of “touch” to identify its immediate surroundings and work out the most appropriate course of movement, with the calculations handled by an algorithm Kemp and his team developed.
While the arm would otherwise follow the shortest path available – or pre-set motions determined by the user – Kemp’s electronic skin and algorithm let it react to obstacles in real time, with the Meka arm’s “springy joints” making those reactions smooth and nearly instantaneous.
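The article only hints at how the controller works, but the general idea – blending motion toward a goal with small corrections driven by whatever the skin’s sensors are feeling – can be sketched in a few lines. The Python sketch below is purely illustrative: the function name, force threshold, and gains are assumptions for the sake of the example, not details of Kemp’s actual algorithm.

```python
import numpy as np

FORCE_LIMIT = 5.0   # hypothetical contact-force threshold, in newtons
GAIN_GOAL = 1.0     # weight on motion toward the goal
GAIN_AVOID = 0.5    # weight on easing away from sensed contact


def reactive_velocity(tip_pos, goal_pos, taxel_forces):
    """Blend goal-seeking motion with corrections driven by skin contact.

    tip_pos, goal_pos: 3-vectors for the arm's tip and its target.
    taxel_forces:      (N, 3) array of the force each skin sensor reports,
                       taken as the force the environment exerts on the arm.
    """
    # Nominal motion: head straight for the goal (the "shortest distance"
    # behavior described above).
    to_goal = goal_pos - tip_pos
    v_goal = GAIN_GOAL * to_goal / (np.linalg.norm(to_goal) + 1e-9)

    # Reactive term: every sensor feeling a significant push contributes a
    # small velocity along that push, i.e. away from whatever the arm has
    # bumped into.
    v_avoid = np.zeros(3)
    for force in taxel_forces:
        magnitude = np.linalg.norm(force)
        if magnitude > FORCE_LIMIT:
            v_avoid += GAIN_AVOID * force / magnitude

    return v_goal + v_avoid


# Example: 384 skin sensors, one of which is pressing against an obstacle.
taxel_forces = np.zeros((384, 3))
taxel_forces[42] = np.array([0.0, 8.0, 0.0])  # one sensor reports an 8 N push

v = reactive_velocity(np.zeros(3), np.array([1.0, 0.0, 0.0]), taxel_forces)
print(v)  # still heads toward the goal, nudged away from the contact
```

In a real system, a loop like this would update the commanded motion many times per second, with compliant hardware such as the Meka arm’s springy joints absorbing whatever contact forces remain.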
Kemp and his team are obviously not keeping this creation to themselves. Next month, they’re headed to the International Conference on Rehabilitation Robotics in Seattle, Washington, to present their recent tests – in which a quadriplegic man managed to manipulate the arm using head motions and got it to hold a cloth and wipe his face with it. The team has also made the software behind the project, along with information about the sensors themselves, public to see if others can help improve it.
“We have released our sensors as open hardware with the intention of supporting researchers and hobbyists, although anyone is welcome to use our designs,” Kemp explained on the website of Healthcare Robotics, the Georgia Tech lab behind the sensor skin. “We hope that the sensors and accompanying software will make it easier for people to build on our research.”
That same website explains the goal of the project: “We are attempting to create a new foundation for robot manipulation that encourages contact between the robot’s arm and the world.” Beyond that, the success of these latest tests suggests that Kemp and his team’s work may also lay new foundations between people with limited use of their limbs and the wider world.