Unlike traditional processors, IBM’s TrueNorth combines 4,096 microcomputing cores into a single chip. The idea is that it handles information a bit more like the human brain does, with the hope that it will be more efficient. In many ways it is, as this incredibly low-power chip is already proving more than capable of the job Samsung wants it to do.
Put together at the Samsung Advanced Institute of Technology, the Dynamic Vision Sensor with TrueNorth at its core is able to process video images very quickly and efficiently, because it does things differently.
Each pixel on the sensor operates entirely independently and only reports back if it detects a change. That means static scenes, or those with less movement, require far fewer resources to be rendered, even in video. This is what makes it possible for cameras equipped with the sensor to shoot slow-motion video to the tune of 2,000 frames per second, more than 15 times what the average high-speed camera is capable of.
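To get a feel for the per-pixel change detection described above, here is a rough sketch of how an event-driven sensor differs from a conventional frame-based one. This is a hypothetical illustration, not Samsung's actual sensor logic; the `dvs_events` function and the brightness threshold are assumptions:

```python
# Minimal sketch of an event-driven ("dynamic vision") sensor model.
# Each pixel fires an event only when its brightness changes by more
# than a threshold between frames, so static regions produce no data.

def dvs_events(prev_frame, new_frame, threshold=10):
    """Return (x, y, polarity) events for pixels whose brightness
    changed by more than `threshold` between two frames."""
    events = []
    for y, (prev_row, new_row) in enumerate(zip(prev_frame, new_frame)):
        for x, (old, new) in enumerate(zip(prev_row, new_row)):
            delta = new - old
            if abs(delta) > threshold:
                # polarity: +1 means the pixel got brighter, -1 darker
                events.append((x, y, 1 if delta > 0 else -1))
    return events

# A static scene produces zero events; a single bright spot moving
# one pixel to the right produces just two events.
frame_a = [[0, 0, 0], [0, 200, 0], [0, 0, 0]]
frame_b = [[0, 0, 0], [0, 0, 200], [0, 0, 0]]

print(dvs_events(frame_a, frame_a))  # static scene: []
print(dvs_events(frame_a, frame_b))  # [(1, 1, -1), (2, 1, 1)]
```

Because an unchanging scene generates no events at all, the data rate scales with motion rather than resolution, which is what lets such a sensor sustain extremely high effective frame rates at low power.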
But beyond just taking shaky video of your new puppy, these sorts of sensors can have far-reaching applications. Automated safety features, especially in autonomous vehicles, could make use of such a sensor, as could cameras that recognize motion controls and gestures.
As CNET explains, the demo Samsung used to show off the technology features a person controlling their TV using gestures as diverse as a closed fist, split fingers, waves, and pinches. That’s far more advanced than what we’ve seen from similar tracking technologies in recent years, such as Microsoft’s Kinect camera.
The military is also sniffing around the technology, which offers potential for drones and remote camera systems to detect unusual changes in video streams.