
Scientists develop imaging tech to help 3D cameras see in bright light

Homogeneous Codes for Energy-Efficient Illumination and Imaging
Wish you could use your Microsoft Xbox Kinect in bright light? Apparently, so did a team of researchers from Carnegie Mellon University and the University of Toronto. The computer scientists looked into why bright light and sunlight cause depth-sensing cameras like the Kinect to fail, and presented their findings and solutions earlier this month at the SIGGRAPH 2015 conference (via RedOrbit.com).

While current 3D sensors in cameras like the Kinect capture all incoming light points, the researchers have developed an imaging technology that gathers only the bits of light the camera needs. When a camera does that, it can eliminate extra light, or noise. Using a mathematical formula, the program processes data from the camera and renders the image even when it's taken in brighter environments; the formula works in bright light, reflective or diffused light, or even through smoke.
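The benefit of gathering only the needed light can be shown with a back-of-the-envelope sketch. This is a hypothetical simulation with made-up numbers, not the researchers' code: if each pixel only integrates light during the brief window when its matching projected rays arrive, ambient light accumulates for a much shorter time, and the signal-to-noise ratio improves.

```python
# Hypothetical illustration of why capturing only the "wanted" light rays
# helps in bright ambient light. All numbers are invented for the sketch.

def snr(signal_photons: float, ambient_photons: float) -> float:
    """Shot-noise-limited SNR: signal over the square root of all
    collected photons (signal plus ambient)."""
    return signal_photons / (signal_photons + ambient_photons) ** 0.5

projector_photons = 1_000    # photons arriving from the projected pattern
ambient_rate = 50_000        # ambient photons per unit of exposure time

# Conventional capture: the sensor stays open for the whole exposure,
# so it collects all the ambient light along with the pattern.
full_exposure = 1.0
snr_conventional = snr(projector_photons, ambient_rate * full_exposure)

# Selective capture: each pixel listens only during the short slice of
# the exposure when the rays it needs actually arrive (say 1% of it).
active_fraction = 0.01
snr_selective = snr(projector_photons,
                    ambient_rate * full_exposure * active_fraction)

print(f"conventional SNR: {snr_conventional:.1f}")  # swamped by ambient light
print(f"selective SNR:    {snr_selective:.1f}")     # much cleaner signal
```

Under these toy numbers the selective capture comes out several times cleaner, which matches the intuition in the quote below: the noise is never collected in the first place.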

Recommended Videos

“We have a way of choosing the light rays we want to capture and only those rays,” says Srinivasa Narasimhan, a CMU associate professor of robotics, in a university statement. “We don’t need new image-processing algorithms and we don’t need extra processing to eliminate the noise, because we don’t collect the noise. This is all done by the sensor.”

Depth-sensing 3D cameras like the Microsoft Kinect are easily overwhelmed by bright light, say researchers from Carnegie Mellon University and the University of Toronto. They’ve developed a technology that projects a pattern onto an object or subject, which helps it determine the 3D contours under bright light. Carnegie Mellon University

The researchers, explaining how depth cameras work, used a low-power laser to project “a pattern of dots or lines over a scene. Depending on how these patterns are deformed or how much time it takes light to reflect back to the camera, it is possible to calculate the 3D contours of the scene.” It is unclear whether the technology can be implemented into existing depth-sensing cameras with a software update.
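The pattern-deformation idea described above boils down to standard structured-light triangulation. As a rough sketch of that geometry (the textbook relation, not the researchers' specific implementation, and with hypothetical camera numbers), the depth of a point can be recovered from how far the projected dot shifts on the sensor:

```python
def depth_from_disparity(baseline_m: float, focal_px: float,
                         disparity_px: float) -> float:
    """Structured-light / stereo triangulation:
    depth = (baseline * focal length) / disparity.
    A projected dot that shifts less on the sensor is farther away."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Hypothetical setup: 7.5 cm projector-camera baseline, 580 px focal length.
near = depth_from_disparity(0.075, 580.0, 29.0)   # larger shift -> nearer
far = depth_from_disparity(0.075, 580.0, 14.5)    # half the shift -> twice as far
print(near, far)  # 1.5 3.0  (depths in meters)
```

Halving the observed shift doubles the computed depth, which is exactly why measuring how the dot pattern deforms lets the camera map the 3D contours of a scene.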

The new research could open the technology to additional applications, or enhance existing ones such as medical imaging, inspection of shiny parts, and sensing for robots in space. The researchers say the technology could also be incorporated into smartphones, making the imaging technology accessible for more everyday uses.

Enid Burns
Former Digital Trends Contributor