
Driverless cars? Mother Nature may have a few things to say about that

[Image: 2015 Honda CR-V]
By 2:00 AM, I had been traveling for over 12 hours, hopping between planes and strange airports. As I staggered out to the parking lot to retrieve my press demonstrator, I was ready for my day to be done. Despite my yearning for rest, I faced a two-hour drive through a dark and stormy night over a stretch of rural Illinois freeway that hadn’t been improved since Abraham Lincoln was Commander in Chief.

Fortunately, I had the 2015 Honda CR-V waiting for me, the first affordable car to include meaningful piloted-driving technology.

Though packed with piloted-driving tech, the CR-V is not perfect, a point that, to my chagrin, quickly became clear. On that dark, rain-soaked road, the CR-V's autonomous tech found and locked onto an old, faded freeway lane marking and attempted to pilot itself across the flow of traffic.

It was this frightening mistake that drove home the realization that, when it comes to dealing with Mother Nature, autonomous driving tech still has a long way to go. Current self-driving technology, not least what is found in the CR-V, is certainly impressive. When it comes to rain, snow, and nighttime, however, modern optically based systems are literally in the dark.

State of play

The 2015 Honda CR-V is a great example of where self-driving cars are going, not to mention an impressive piece of kit in its own right. Accordingly, it makes a perfect example with which to judge the current state of piloted-driving proficiency. The core of its self-driving capabilities is the combination of existing systems, including Lane Keeping Assist and Adaptive Cruise Control (ACC).

In most cars, ACC is capable of keeping pace with the flow of traffic on the freeway by using radar to track the speed and distance of the vehicle ahead. In conjunction, Lane Keeping Assist uses cameras to find and identify lane markings and warns the driver if he or she strays out of the lane. Some vehicles, like the current Mercedes-Benz E-Class, can autonomously steer the car back into the lane a little. So, too, can the CR-V, as long as the driver keeps his or her hands on the wheel.
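To make that concrete, here is a minimal sketch of the kind of control loop an ACC system runs. Everything in it, the function names, the gains, the comfort limits, is a hypothetical illustration of the general technique, not Honda's implementation.

```python
# Minimal, hypothetical sketch of an adaptive-cruise-control loop.
# Gains and limits are illustrative, not any automaker's real values.

def acc_command(ego_speed_mps: float,
                gap_m: float,
                closing_speed_mps: float,
                set_speed_mps: float = 31.0,   # driver-selected, ~70 mph
                time_gap_s: float = 1.8,       # desired following time gap
                k_gap: float = 0.2,
                k_close: float = 0.5) -> float:
    """Return a target acceleration (m/s^2) from radar range data."""
    # No lead vehicle detected: cruise toward the driver's set speed.
    if gap_m == float("inf"):
        return max(min(0.3 * (set_speed_mps - ego_speed_mps), 2.0), -3.0)
    # Otherwise blend the gap error and the closing speed into a demand.
    desired_gap_m = ego_speed_mps * time_gap_s
    accel = k_gap * (gap_m - desired_gap_m) - k_close * closing_speed_mps
    return max(min(accel, 2.0), -3.0)  # clamp to comfortable limits

# Example: 25 m behind a car we are closing on at 2 m/s while doing 27 m/s.
print(acc_command(ego_speed_mps=27.0, gap_m=25.0, closing_speed_mps=2.0))
```

The output here is a firm braking request, which matches intuition: the car is well inside its desired following gap and still closing.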

How does it do this? The CR-V uses cameras paired with a powerful pattern recognition program to find lane markings. That information is then sent to the electric power steering, which turns the wheels accordingly. The technology is, in essence, fairly simple. The complicated bit is creating a program that can consistently and accurately recognize lane markings in all of their various forms and states of disrepair.
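As a rough illustration of that pattern-recognition step, the toy sketch below uses classical computer vision, OpenCV's Canny edge detector plus a Hough transform, to pick painted markings out of a synthetic frame and estimate a lateral offset. Production systems are far more sophisticated; this is just the general shape of the problem.

```python
# Toy lane-finding sketch (illustrative only; not Honda's software).
# Requires the opencv-python and numpy packages.
import cv2
import numpy as np

# Synthetic dashcam frame: dark pavement with two painted lane markings.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.line(frame, (60, 240), (140, 100), (255, 255, 255), 4)   # left marking
cv2.line(frame, (260, 240), (180, 100), (255, 255, 255), 4)  # right marking

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)                 # find high-contrast edges
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                        minLineLength=40, maxLineGap=10)

# Average the detected segments' midpoints to estimate the car's offset
# from lane center, the number a real system would hand to the steering.
if lines is not None:
    mid_x = np.mean([(x1 + x2) / 2 for x1, y1, x2, y2 in lines[:, 0]])
    offset = mid_x - frame.shape[1] / 2          # pixels off lane center
    print(f"{len(lines)} segments found, lateral offset ~{offset:.1f} px")
else:
    print("No lane markings recognized -- the system would disengage.")
```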

The problem

Admittedly, that sounds great. When I used the system on that dark and rainy freeway, though, things fell apart. The system constantly cut in and out as it found and lost lane markings. On one notable occasion, it even tried to drag me into the next lane over as it followed badly covered markings left over from an earlier phase of construction.

Why did this happen? It's not because Honda's system is defective, or even bad, but because of inherent limitations of the sensors. Cameras, like the human eye, work great in daylight and good visibility. When light levels drop and rain increases, their ability to capture a full image drops significantly.

[Image: 2015 Honda CR-V]

A rain-soaked camera lens, for example, washes out the contrast between lane markings and pavement for the camera sensor, or obscures the markings completely. Anyone who has ever tried to take a photo in the dark should understand the problem. The reduced input of information, in turn, makes the job of the pattern recognition software much more difficult. Computer programs, as a rule, can't intuit results from incomplete information, something our brains are actually great at.
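The same toy pipeline makes the failure mode easy to see: darken and blur the frame, roughly what a wet lens at night does, and the edge detector has far less to hand to the recognizer. This is my own illustration, not a measurement from any real system.

```python
# Toy demo of contrast loss starving an edge detector (illustrative only).
import cv2
import numpy as np

frame = np.zeros((240, 320), dtype=np.uint8)
cv2.line(frame, (60, 240), (140, 100), 255, 4)   # crisp daytime marking

# Simulate night rain: cut brightness to a quarter and smear the image.
degraded = cv2.GaussianBlur((frame * 0.25).astype(np.uint8), (15, 15), 0)

for name, img in (("clear", frame), ("degraded", degraded)):
    edge_pixels = int(np.count_nonzero(cv2.Canny(img, 50, 150)))
    print(f"{name}: {edge_pixels} edge pixels detected")
```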

As an example, when the CR-V tried to follow an old lane marking, it was obvious to me that we were about to move diagonally across the freeway, because I was able to take in the whole picture. The computer program, however, was doggedly doing the only thing it could: following the markings it recognized.

What I encountered is only the tip of the iceberg … literally. Snow and ice render optical systems essentially useless, covering up everything from lane markings to street signs. So, if this is the case, why even use cameras?

But wait!

Our entire driving infrastructure is based on human vision. Everything from traffic signs to brake lights is designed with the human eye and brain in mind. For that reason, any automated system that is going to operate in this infrastructure has to be able to read these same symbols. And, right now, the only cost-effective way of doing that is with cameras.

Truly autonomous cars get around some of these problems by augmenting cameras with other, more complex sensors, especially LIDAR. The term is a portmanteau of "light" and "radar," which gives a pretty good sense of how it works. On cars, LIDAR systems use a spinning array of lasers that bounce off the world around the vehicle; sensors detect the reflected light, building up a picture of the car's surroundings.
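A toy example of that geometry, with made-up readings: each echo comes back as an angle and a distance, and simple trigonometry turns a revolution's worth of echoes into a map of points around the car. Real sensors have their own packet formats and calibration; this only shows the core idea.

```python
# Toy conversion of spinning-LIDAR returns into a 2D point map.
# Angles, distances, and axis conventions are invented for illustration.
import math

def scan_to_points(returns):
    """returns: list of (angle_deg, distance_m) echoes from one revolution."""
    points = []
    for angle_deg, distance_m in returns:
        theta = math.radians(angle_deg)
        points.append((distance_m * math.cos(theta),   # x: ahead of the car
                       distance_m * math.sin(theta)))  # y: to the side
    return points

# One sparse revolution: a wall ~10 m ahead and a car ~4 m to one side.
echoes = [(0.0, 10.1), (2.0, 10.0), (4.0, 10.2), (270.0, 4.1), (272.0, 4.0)]
for x, y in scan_to_points(echoes):
    print(f"obstacle at x={x:+.1f} m, y={y:+.1f} m")
```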

This approach has the advantage of working in far more conditions than an optical camera, though lasers can still be fooled by rain and snow bouncing signals back. The bigger problem is that these systems are expensive and bulky, as the massive sensor package on top of Google's autonomous cars can attest. In fact, that system by itself costs around $80,000, far more than automakers would be willing to spend at this point.

For autonomous cars to get around this challenge, one of three things has to happen: cameras and their attached computers need to get much better at recognizing the world in a broader array of conditions, LIDAR and similar systems need to get much cheaper, or the way we mark our roads needs to start taking robots into account.

Conclusion

To really make autonomous driving work, sensors need to be complemented by Car-2-Car and Car-2-X communication, also referred to as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I). This will allow cars not only to communicate with each other to fill in sensor gaps, but also to receive messages from the road infrastructure itself. The early stages of this technology are being researched and prototyped, but full implementation is a ways off.
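The shape of such a broadcast might look something like the sketch below. The message format is hypothetical, loosely inspired by the basic-safety-message idea rather than any real standard's wire format, and the radio layer (DSRC or cellular V2X) is hand-waved away entirely.

```python
# Hypothetical V2V broadcast message (not a real standard's format).
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SafetyMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    braking: bool
    timestamp: float

def broadcast(msg: SafetyMessage) -> bytes:
    """Serialize for broadcast; a real radio stack would carry the bytes."""
    return json.dumps(asdict(msg)).encode()

msg = SafetyMessage("car-42", 40.1164, -88.2434, 29.0, 87.5, True, time.time())
print(broadcast(msg))  # a nearby car decoding this learns of braking ahead
```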

While the world waits for cars to start talking to each other, and for systems like LIDAR to improve and, more importantly, get cheaper, optical systems will remain the main option. Companies recognize this and are working to get the most out of camera-based systems and the software that runs them. Look ahead to our next Road Rave to see what some of those innovations might be.

In the meantime, optically based systems can still do an impressive amount. The Honda CR-V is already able to do what was nearly unimaginable a decade ago: drive itself on the highway. Still, the dream of fully autonomous cars just might not be as close as some companies believe.
