
How crowdsourced lidar could give your car X-ray-like superpowers

One of my uncles always tells a story about how, when he was a kid, his mom would tell him she could see around corners. They would be out walking someplace, only for his mom to tell him the details of some vehicle or person that was about to appear around a bend in the road. A few seconds later and, sure enough, that vehicle or person would appear, exactly as described. Magic, surely?

Of course, it wasn’t magic at all: His mom — my grandmother — was just taller than he was, and could see over walls and other obstacles that he wasn’t able to. What appeared to be some kind of superpower was really just about having a superior vantage point.

Now, researchers from the U.K.’s University of Cambridge, University of Oxford, and University College London want to give every car on the planet the ability to see around corners. And, with genuine magic being in short supply, they’ve come up with a way to shift the world’s vantage points using a combination of lidar, augmented reality, and crowdsourcing.

If it works as promised — and that’s a big if — it could transform the way we drive by allowing drivers to “see through” objects, alerting them to potential hazards without distracting them in the process. And it will also “beam” the information directly into your eye for good measure.

Crowdsourcing lidar

Lidar (light detection and ranging) is the depth-sensing technology that maps surroundings by timing bounced laser pulses, and it is what allows many self-driving cars to perceive the world around them. As it happens, those last four words — “the world around them” — are the bit that the researchers behind this project want to change. To give drivers something akin to X-ray vision that allows them to spot obstacles hidden from view — such as a motorcyclist momentarily obscured behind a vehicle — they want to build a massive crowdsourced map of lidar data gathered from all road users.
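At its core, lidar ranging is a time-of-flight calculation: a laser pulse goes out, bounces off a surface, and the elapsed time gives the distance. A toy sketch of that principle (the function and values here are illustrative, not taken from the research):

```python
# Toy illustration of the time-of-flight principle behind lidar ranging.
# Huge numbers of such pulses per second, each tagged with a beam
# direction, build up the 3D "point cloud" a vehicle perceives.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting surface for one laser pulse."""
    return C * t_seconds / 2.0  # halved: the pulse travels out and back

# A return detected ~66.7 nanoseconds after emission is roughly 10 m away.
print(round(range_from_round_trip(66.7e-9), 2))  # 10.0
```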

University of Cambridge

For an analogy of what this might look like, think of the scene in Christopher Nolan’s 2008 movie The Dark Knight in which Batman hacks every cell phone in Gotham City, turning each handset into a high-frequency generator and stitching the resulting signals together into a three-dimensional schematic of the city, from buildings to people. As Lucius Fox, the perturbed Wayne Enterprises boss, puts it: “You took my sonar concept and applied it to every phone in the city. Half the city feeding you sonar; you can image all of Gotham.”

The idea of car-to-car communication for collaborative purposes isn’t exactly science fiction. Starting with Waze, many mapping apps have used driving data from different users to build up a pretty detailed picture of traffic conditions on the road. Tesla, meanwhile, collects large amounts of road data from vehicle owners via its Full Self-Driving beta test fleet. In 2017, Tesla asked owners if they were willing to provide videos captured by their cars’ onboard Autopilot cameras. This data, while collected by individual vehicles, is combined to make the overall fleet smarter and better able to deal with obstacles.

What this latest lidar project adds is the gathering of 360-degree point-cloud data that can be aggregated to give every road user a clear view of their surroundings, including objects hidden from their own line of sight.
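Before scans from different vehicles can be aggregated, each vehicle’s points must be expressed in a common map frame. A minimal numpy sketch of that idea, assuming each vehicle knows its own heading and position (the names and toy values are mine, not the project’s):

```python
import numpy as np

def to_world_frame(points, yaw_rad, position):
    """Rotate a vehicle-local (x, y, z) point cloud by the vehicle's heading,
    then translate it by the vehicle's position in the shared map frame."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T + position

# Toy scans: two vehicles each report a few returns in their own frames.
scan_a = np.array([[5.0, 0.0, 0.5], [6.0, 1.0, 0.5]])
scan_b = np.array([[3.0, -2.0, 0.4]])

# Once both sit in the map frame, their union covers areas that neither
# vehicle alone can see: the crowdsourced "view around the corner."
shared_map = np.vstack([
    to_world_frame(scan_a, yaw_rad=0.0, position=np.array([0.0, 0.0, 0.0])),
    to_world_frame(scan_b, yaw_rad=np.pi / 2, position=np.array([20.0, 5.0, 0.0])),
])
print(shared_map.shape)  # (3, 3)
```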

As Jana Skirnewskaja, a researcher on the team, told Digital Trends, these are still early days for the project. So far, the team has carried out a proof-of-concept scan of Malet Street, a busy street in London, using multiple lidar scanners in various positions. This data was then used to build a 3D model.

3D Model of Malet St, Central London, Based on LiDAR Data

“We scanned Malet Street from 10 different positions using 10 different data scanners,” Skirnewskaja told Digital Trends. “This allows us to fully re-create the street how it is at that moment, so any objects — hidden or not — will be [represented in] the point cloud. This allows us to erase objects that we don’t want to see, and choose the objects which are hidden … and project them.”
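That “erase and choose” step amounts to filtering a labeled point cloud. A hypothetical sketch, assuming an upstream segmentation step has already tagged each point with a class (the classes, labels, and coordinates are invented for illustration):

```python
import numpy as np

# Toy merged cloud: (x, y, z) per point, plus an integer class label
# a segmentation step might assign.
CAR, PEDESTRIAN, BUILDING = 0, 1, 2
points = np.array([[4.0, 1.0, 0.5],
                   [4.2, 1.1, 0.6],
                   [9.0, 3.0, 1.0],
                   [2.0, 5.0, 8.0]])
labels = np.array([CAR, CAR, PEDESTRIAN, BUILDING])

# "Erase" geometry we don't want to render for the driver...
non_building = points[labels != BUILDING]

# ...and "choose" just the hidden hazard to project.
hazard = points[labels == PEDESTRIAN]
print(non_building.shape, hazard.shape)  # (3, 3) (1, 3)
```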

Beaming the information into drivers’ eyes

As it happens, this is only one half of the project. The other, equally impressive, half involves projecting this information directly into the eye of the driver in ultra-high definition. This in-car technology, Skirnewskaja believes, could be a valuable alternative to 2D windscreen AR projection, as well as to emerging alternatives like augmented reality contact lenses.

“What our studies have shown is that it [causes] no harm at all to the pupil, to the human eye,” she said. “It can project, [directly] into the driver’s eye, any object. We can also use augmented reality to layer objects so that we project different objects, like road obstacles or signs or people or trees, at different sizes [to indicate] distances. The further away an object is, the smaller it will be. That can be realized.”
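The distance cue Skirnewskaja describes, where farther objects render smaller, is ordinary perspective scaling. A one-function sketch of the idea (the reference distance is an assumption for illustration, not a figure from the paper):

```python
def overlay_scale(distance_m: float, reference_m: float = 10.0) -> float:
    """Scale factor for an AR overlay: an object twice as far away is drawn
    half as large, mimicking natural perspective so depth reads intuitively."""
    return reference_m / max(distance_m, 0.1)  # clamp to avoid divide-by-zero

print(overlay_scale(10.0))  # 1.0
print(overlay_scale(40.0))  # 0.25
```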


It means that, as a driver sits behind the wheel, they could see hazard information superimposed on the real world. “[Our work] has shown that we can already project in-eye 3D augmented reality objects on the road, and that these are properly aligned and not distracting the driver,” Skirnewskaja said.

She said that, initially, this will likely be fixed information, such as highlighting permanent obstacles that have caused other drivers problems. But, in the long term, it could be possible to track dynamic objects as well. In addition to gathering lidar data from other vehicles, Skirnewskaja said that cities could install lidar sensors along the sides of roads, similar to the way CCTV cameras are used today.

“We hope that it can be expanded further so that we can connect every car and project this information of road obstacles in real time,” she explained.

The team aims to work with established automotive companies as part of the project; Skirnewskaja suggested that these include Jaguar Land Rover and VW. At present, the researchers are working to miniaturize the optical components used in their experimental holographic setup so that the system can be fitted into a car. After that, they plan to carry out vehicle tests on public roads in the city of Cambridge.

There’s no word on when this technology will ultimately go live but, provided it works as well as described, it’ll certainly be worth waiting for.

A paper describing the work was recently published in the journal Optics Express.


Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…