This camera that sees in real time could mean safer driverless cars and drones

[Image: Assistant Professor Chen Shoushun with the Celex real-time camera. Credit: Nanyang Technological University Singapore]
Driverless cars, drones, and other unmanned vehicles can only react to potential hazards if they “see” them fast enough. A team from Nanyang Technological University (NTU) in Singapore recently developed a camera, called Celex, that is fast enough to see in real time.

Conventional video cameras record as many as a few hundred images per second and string them together to create a video. While cameras are getting faster, current options are limited by how quickly a computer can make sense of all that data and process so many large files. Essentially, the camera captures the information, but the computer inside cannot process it fast enough.
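To get a sense of that bottleneck, it helps to estimate the raw bandwidth a frame-based camera produces. The sketch below is a back-of-envelope illustration assuming a 1080p sensor and 8-bit RGB frames; the figures are illustrative assumptions, not Celex specifications.

```python
# Rough estimate of the raw data a frame-based camera generates.
# Resolution, bit depth, and frame rates are illustrative assumptions,
# not specifications of Celex or any particular camera.

def raw_data_rate_mb_per_s(width, height, bytes_per_pixel, fps):
    """Uncompressed video bandwidth in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

for fps in (30, 120, 300):
    rate = raw_data_rate_mb_per_s(1920, 1080, 3, fps)  # 8-bit RGB frames
    print(f"{fps:>3} fps -> about {rate:,.0f} MB/s of raw pixel data")
```

Even at a few hundred frames per second, the sensor pushes out nearly two gigabytes of pixel data every second, which is what the downstream computer struggles to keep up with.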


The research group from NTU developed a camera that registers changes in light within nanoseconds rather than capturing traditional frames, allowing the system to adjust to light changes faster than a typical camera. Instead of taking a large number of full photos per second to create a video feed, Celex does not read out an entire image at all; each pixel reports only how it has changed since the previous view. Since the camera processes only those changes instead of entire images, its speed is dramatically increased.
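The article does not describe Celex's internal readout, but the general idea of change-based sensing can be sketched in a few lines. The toy simulation below, with an invented events_from_frames helper and an arbitrary intensity threshold, is only meant to illustrate emitting per-pixel change events instead of full frames.

```python
import numpy as np

def events_from_frames(prev_frame, new_frame, threshold=10):
    """Toy change-based readout: emit (row, col, polarity) events for
    pixels whose intensity changed by more than `threshold` since the
    previous view, instead of transmitting the whole image."""
    diff = new_frame.astype(np.int16) - prev_frame.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarities = np.sign(diff[rows, cols])            # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

# Two synthetic 8-bit grayscale views where only a small patch changes.
prev_frame = np.zeros((240, 320), dtype=np.uint8)
new_frame = prev_frame.copy()
new_frame[100:110, 150:160] = 200                     # a bright object appears

events = events_from_frames(prev_frame, new_frame)
print(f"{len(events)} events instead of {prev_frame.size} pixel values")
```

Only the 100 changed pixels generate output while the other 76,700 stay silent, which is why this style of readout scales with scene activity rather than with resolution.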

The camera also uses a built-in computer to analyze what is in the foreground, meaning what is close to the camera, and what is in the background. This optical flow computation helps the system determine which parts of the scene only appear to move because the vehicle is moving and which objects are actually moving on their own, possibly onto a collision path.
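The article does not say how the onboard circuit computes optical flow, so the sketch below only illustrates the general idea on conventional frames: it uses OpenCV's Farneback dense optical flow and flags pixels whose motion differs markedly from the scene's median motion (a rough stand-in for the camera-induced motion). The moving_object_mask function and its threshold are illustrative choices, not part of Celex.

```python
import cv2
import numpy as np

def moving_object_mask(prev_gray, next_gray, motion_margin=2.0):
    """Flag pixels that move noticeably more than the overall scene,
    a crude proxy for 'moving on its own' versus 'moving scenery'."""
    # Dense optical flow; positional arguments are pyr_scale, levels,
    # winsize, iterations, poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)      # per-pixel motion in pixels
    background_motion = np.median(magnitude)      # rough ego-motion estimate
    return magnitude > background_motion + motion_margin

# Usage with two consecutive grayscale frames:
# prev_gray = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
# next_gray = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)
# mask = moving_object_mask(prev_gray, next_gray)
```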

The research team, led by Assistant Professor Chen Shoushun, says the camera system is also better than traditional options for night driving and driving in bad weather, because of the onboard circuit that processes all the data. “Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyze and process the video feed,” Chen said. “With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happen within seconds.”

Of course, because the camera records only changes in order to keep file sizes small, the technology isn't destined to wind up in consumer cameras. But the enhanced speed could help increase safety in applications where the camera serves as a pair of eyes rather than an artistic tool, such as driverless cars and drones.

Work on Celex started in 2009 and the group launched a startup based on the technology. According to the researchers, the system, now in its final prototype stage, could hit the market before the end of 2017.

Hillary K. Grigonis