
Eyesight Technologies’ in-car A.I. can tell when drivers smoke or use phones

Eyesight Technologies action detection in DMS: Detecting smoking, holding phone and seatbelt wearing

According to recent European Union proposals, all new cars on the road as of mid-2022 must be equipped with special driver monitoring systems capable of telling whether the person behind the wheel is distracted. Calls for similar guidelines are being heard all over the world. With that in mind, Israel-based Eyesight Technologies has developed new in-car monitoring technology for spotting when drivers are using cell phones or smoking while driving. Both are major potential causes of car accidents, since they stop drivers from focusing fully on the road in front of them.

The latest feature builds on the existing in-car monitoring system Eyesight Technologies has developed. This system uses an infrared sensor to detect driver features such as head pose, blink rate, eye openness, and gaze. By constantly measuring these factors over the course of a journey, Eyesight’s in-cabin A.I. can work out if a driver is starting to fall asleep or is distracted in some other way.
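To make the idea concrete, here is a minimal sketch of how eye-openness measurements can feed a drowsiness score. It uses PERCLOS (the fraction of recent frames in which the eyes are mostly closed), a standard metric in driver monitoring research; it is not Eyesight's actual algorithm, and the window size and thresholds below are assumptions for illustration.

```python
from collections import deque

class DrowsinessMonitor:
    """Toy PERCLOS-based drowsiness estimate from per-frame eye openness."""

    def __init__(self, window_frames=900, closed_threshold=0.2,
                 perclos_alert=0.15):
        # window_frames: sliding window, e.g. 30 seconds at 30 fps
        # closed_threshold: eye openness below this counts as "closed"
        # perclos_alert: PERCLOS above this level triggers an alert
        self.window = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold
        self.perclos_alert = perclos_alert

    def update(self, eye_openness):
        """eye_openness: 0.0 (fully closed) to 1.0 (fully open)."""
        self.window.append(eye_openness < self.closed_threshold)

    def perclos(self):
        # Fraction of frames in the window with eyes closed
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window)

    def is_drowsy(self):
        return self.perclos() >= self.perclos_alert

# Feed in a short run of simulated eye-openness readings
monitor = DrowsinessMonitor(window_frames=10)
for openness in [0.9, 0.8, 0.1, 0.05, 0.9, 0.1, 0.9, 0.8, 0.05, 0.1]:
    monitor.update(openness)
print(monitor.perclos())    # 0.5 -> eyes closed in five of ten frames
print(monitor.is_drowsy())  # True
```

A production system would combine signals like this with head pose, blink rate, and gaze direction rather than relying on any single metric.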


“The new features we are introducing build on our previous capabilities to add the detection of actions: Speaking on your handset or smoking behind the wheel,” Liat Rostock, vice president of marketing at Eyesight Technologies, told Digital Trends. “This allows car manufacturers and also fleet owners to create a specific response to those actions. For example, if you’re a truck driver hauling fuel, the trucking company can get an urgent alert if you start smoking a cigarette. Another scenario is a parent who could get an alert if their teen is chatting on their handset while driving.”

Eyesight sells its technology to car manufacturers to be pre-installed in new models. According to Rostock, the company has just signed a $15 million deal with a major U.S. car manufacturer. It can also provide its software and a camera to be retrofitted into fleet vehicles. The company also sees potential in the shared-car economy, where its technology could deter smoking in communal cars, including rentals and autonomous taxis, where no driver is present to enforce such policies. Eyesight will be demoing its new smoking and phone detection features at CES 2020 in January.

Luke Dormehl
How the USPS uses Nvidia GPUs and A.I. to track missing mail

The United States Postal Service, or USPS, is relying on artificial intelligence powered by Nvidia's EGX systems to track the more than 100 million pieces of mail a day that go through its network. The world's busiest postal service is relying on GPU-accelerated A.I. systems to help solve the challenge of locating lost or missing packages and mail. Essentially, the USPS turned to A.I. to help it locate a "needle in a haystack."

To solve that challenge, USPS engineers created an edge A.I. system of servers that can scan and locate mail. They created algorithms for the system that were trained on 13 Nvidia DGX systems located at USPS data centers. Nvidia's DGX A100 systems, for reference, pack five petaflops of compute power and cost just under $200,000. They are based on the same Ampere architecture found in Nvidia's consumer GeForce RTX 3000 series GPUs.

Algorithmic architecture: Should we let A.I. design buildings for us?

Designs iterate over time. Architecture designed and built in 1921 won’t look the same as a building from 1971 or from 2021. Trends change, materials evolve, and issues like sustainability gain importance, among other factors. But what if this evolution wasn’t just about the types of buildings architects design, but was, in fact, key to how they design? That’s the promise of evolutionary algorithms as a design tool.

While designers have long used tools like computer-aided design (CAD) to help conceptualize projects, proponents of generative design want to go several steps further. They want to use algorithms that mimic evolutionary processes inside a computer to help design buildings from the ground up. And, at least when it comes to houses, the results are pretty darn interesting.
Generative design
Celestino Soddu has been working with evolutionary algorithms for longer than most people working today have been using computers. A contemporary Italian architect and designer now in his mid-70s, Soddu became interested in the technology’s potential impact on design back in the days of the Apple II. What interested him was the potential for endlessly riffing on a theme. Or as Soddu, who is also professor of generative design at the Polytechnic University of Milan in Italy, told Digital Trends, he liked the idea of “opening the door to endless variation.”
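The core loop behind such evolutionary approaches is simple to sketch. The toy example below evolves a building footprint (width, depth, height) toward a target floor area; it is in the spirit of what Soddu describes, not his actual system, and the 200 m² target and the slenderness bonus are assumptions chosen purely for illustration.

```python
import random

random.seed(42)

TARGET_AREA = 200.0  # assumed target footprint area, in square meters

def fitness(genome):
    width, depth, height = genome
    area_error = abs(width * depth - TARGET_AREA)  # stay near target area
    slenderness = height / max(width, depth)       # mildly reward taller forms
    return -area_error + 5.0 * slenderness

def mutate(genome, scale=0.5):
    # Small random tweaks, clamped so dimensions stay physically plausible
    return tuple(max(1.0, g + random.gauss(0.0, scale)) for g in genome)

def crossover(a, b):
    # Take each dimension from one parent or the other
    return tuple(random.choice(pair) for pair in zip(a, b))

# Start from random designs, then repeatedly keep the fittest few
# and breed mutated offspring from them
population = [tuple(random.uniform(5.0, 30.0) for _ in range(3))
              for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(15)]
    population = parents + children

best = max(population, key=fitness)
print("best design (w, d, h):", tuple(round(g, 1) for g in best))
```

Real generative-design systems encode far richer genomes (room layouts, structural members, facade elements) and score them against many objectives at once, but the select-breed-mutate cycle is the same.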

Emotion-sensing A.I. is here, and it could be in your next job interview

I vividly remember witnessing speech recognition technology in action for the first time. It was in the mid-1990s on a Macintosh computer in my grade school classroom. The science fiction writer Arthur C. Clarke once wrote that “any sufficiently advanced technology is indistinguishable from magic” -- and this was magical all right, seeing spoken words appearing on the screen without anyone having to physically hammer them out on a keyboard.

Jump forward another couple of decades, and now a large (and rapidly growing) number of our devices feature A.I. assistants like Apple’s Siri or Amazon’s Alexa. These tools, built using the latest artificial intelligence technology, aren’t simply able to transcribe words -- they are able to make sense of their content in order to carry out actions.
