
This incredible brain-reading headset aims to make mice and keyboards obsolete


Conor Russomanno is working on some pretty lofty stuff: He’s building a headset that will be able to noninvasively read people’s thoughts and use them to control the computer interfaces of tomorrow. But right now, his big worry is whether or not he chose the right name for his startup.

“You know,” said Russomanno, the co-founder and CEO of a brain-computer interface startup called OpenBCI, “sometimes I wish that we had named our company OpenMCI — like, mind computer interface or something like that.”


Russomanno isn’t the first founder — and won’t be the last — to experience momentary pangs of regret over naming his company. But, in his case, that regret has nothing to do with the name potentially failing to resonate with focus groups, or infringing on a trademark, or any of the other everyday reasons a founder might have second thoughts.

In fact, it’s a very tech industry remix of a classic philosophical conundrum: The difference between the brain and the mind. And it may just be the future of computing as we know it.

Building a mind-computer interface

Let’s back up a little. The first time Russomanno got seriously interested in the brain, he had just suffered a traumatic brain injury. As a college football player, he had taken his share of concussions before. But the one he sustained in 2010, while playing rugby for Columbia University’s club team, was different. “I was having trouble reading and studying,” Russomanno told Digital Trends. “I really started to ponder the difference between brain and mind. If you damage the ‘hardware,’ you can feel it in the ‘software.’ I went to a number of psychologists and neurologists, and they all told me I was fine. [But I didn’t] feel fine.”

Russomanno recovered from the concussion, but his interest in the brain didn’t waver. A year later, he found himself in graduate school, studying in the Design and Technology MFA program at Parsons School of Design. Russomanno was asked to build a project for his physical computing class. Looking around, he found an online tutorial that laid out exactly how to hack brain waves out of an electroencephalography (EEG) toy and into open source software. “That was the beginning of my pursuit into BCI,” he said. “Haven’t looked back since.”

OpenBCI, a startup based in Brooklyn, New York, burst onto the scene in 2015 with a pair of Kickstarter campaigns that aimed to put brain-computer interface tools within reach of researchers on a budget. Between them, the campaigns raised a shade under $400,000 and launched the company. Now OpenBCI is back with its most ambitious project to date: a virtual reality- and augmented reality-compatible, sensor-studded headset called Galea, announced this month.

“If you’re trying to know exactly the way someone’s physiology or brain or mind is changing in response to stimuli, you have to make sure that all that data is very, very tightly time-locked to the stimuli itself.”

Galea, which will initially ship sometime in 2021, is one of a growing number of portable EEG headsets that monitor electrical activity on the scalp and relay this information to a computer.

The idea of using EEG signals to interface directly with a computer is not new. In the early 1970s, Jacques Vidal, a professor at the Brain Research Institute at the University of California, Los Angeles (UCLA), coined the phrase “brain-computer interface” to describe the notion. “Can these observable electrical brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships?” Vidal pondered in a 1973 research paper. “Even on the sole basis of the present states of the art of computer science and neurophysiology, one may suggest that such a feat is potentially around the corner.”

Things have taken considerably longer than Vidal might have initially guessed. But EEG is finally starting to live up to its potential. Even in the past several years, the technology has become more portable and effective. However, the promise of Galea is about more than just EEG. The headset will reportedly include multiple sensors — not just electroencephalography, but also electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA), and photoplethysmography (PPG). This means it will gather data not just from the brain, but also from the wearer’s eyes, heart, skin, and muscles, making it possible to “objectively measure” a range of internal states via the body’s biological responses to stimuli.

According to OpenBCI, this should allow Galea to accurately quantify engagement metrics including happiness, anxiety, depression, attention span, interest level, and more — all in real time.


“The way the neuroscience community currently do what’s called multimodal sensing — or I like to call sensor fusion — is that they buy a number of different products from different third-party developers [and] then have to stitch this data together in software,” Russomanno said. “This poses problems because time-locking is very, very important. If you’re trying to know exactly the way someone’s physiology or brain or mind is changing in response to stimuli, you have to make sure that all that data is very, very tightly time-locked to the stimuli itself. The way that people do this now is they have different drivers, different software, and different hardware setups. It’s a very tedious process right now for neuroscientists and [research and development] developers.”
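Galea’s pitch is to solve that time-locking problem in hardware, with every sensor stamped against one shared clock. To see why the shared clock matters, here is a minimal Python sketch of the software-side bookkeeping Russomanno describes: cutting windows (“epochs”) of data out of two hypothetical sensor streams around each stimulus onset. The sample rates, signals, and function names are illustrative, not anything from OpenBCI.

```python
import numpy as np

def epoch_stream(timestamps, samples, stimulus_times, window=(-0.1, 0.5)):
    """Cut a window of samples around each stimulus onset.

    timestamps     : sample times in seconds, on a shared clock
    samples        : sensor readings, same length as timestamps
    stimulus_times : stimulus onset times, on the same clock
    window         : (start, end) offsets around each stimulus, in seconds
    """
    epochs = []
    for t in stimulus_times:
        mask = (timestamps >= t + window[0]) & (timestamps < t + window[1])
        epochs.append(samples[mask])
    return epochs

# Two hypothetical streams recorded at different rates on one shared clock.
eeg_t = np.arange(0, 10, 1 / 250)        # EEG sampled at 250 Hz
eeg = np.random.randn(eeg_t.size)
eye_t = np.arange(0, 10, 1 / 60)         # eye tracking at 60 Hz
eye = np.random.randn(eye_t.size)
stimuli = np.array([2.0, 5.5, 8.2])      # stimulus onsets, in seconds

eeg_epochs = epoch_stream(eeg_t, eeg, stimuli)
eye_epochs = epoch_stream(eye_t, eye, stimuli)
# Because both streams share one clock, epoch i in each list describes the
# same stimulus. With separate devices, drivers, and clocks, restoring that
# alignment is exactly the tedious stitching step Russomanno is describing.
```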

Society of mind readers

This idea of combining data from different sources is where the question of brain-versus-mind comes into play. At its loosest, the difference between the brain and the mind is, as Russomanno pointed out, the difference between hardware and software. The mind is undoubtedly associated with the brain, but it is not necessarily the same thing. The brain is a physical organ, while the mind is an intangible, hypothetical concept that relates to a person’s understanding of the world, consciousness, and thought processes.

Dualism posits that our minds are more than simply our brains. This is a spiritual concept, but a version of it applies here. If you’re trying to measure someone’s thought processes, you could do better than simply limiting yourself to the comparatively low spatial resolution of EEG brain analysis. You know how they say the eyes are the windows to the soul? Well, maybe other bodily responses are as well. Build enough windows in a house and you should be able to see what’s happening inside.

“What we really care about is human emotion, human intent. We care about internal states of mind and the way that environments and activities change that.”

Valuable indicators of the mind could be found by, for instance, using image-based eye tracking to infer information about intent, interest, and arousal that could then be cross-referenced with EEG data. These combined data sets have significantly greater predictive value than just one on its own. As Russomanno bluntly puts it: “We don’t really care about the brain; we just know that the brain is the nucleus of the nervous system and the nucleus of the mind. What we really care about is human emotion, human intent. We care about internal states of mind and the way that environments and activities change that.”
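The article doesn’t say how that cross-referencing would work in practice, but a common approach in the research literature is simple feature-level fusion: derive a handful of features from each modality, stack them side by side, and train a single model on the combined vector. The Python sketch below uses synthetic data and invented feature names purely to illustrate the idea that two weak signals can predict better together than either does alone.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials = 400

# Synthetic stand-ins for per-trial features (not real Galea data):
eeg_features = rng.normal(size=(n_trials, 8))   # e.g., band power per channel
eye_features = rng.normal(size=(n_trials, 3))   # e.g., pupil size, blink rate
# Toy "interest" label that depends on both modalities.
labels = (eeg_features[:, 0] + eye_features[:, 0] > 0).astype(int)

def fit_and_score(features, targets):
    X_tr, X_te, y_tr, y_te = train_test_split(features, targets, random_state=0)
    return LogisticRegression().fit(X_tr, y_tr).score(X_te, y_te)

print("EEG only :", fit_and_score(eeg_features, labels))
print("Eyes only:", fit_and_score(eye_features, labels))
# Feature-level fusion: concatenate the modalities per trial.
fused = np.hstack([eeg_features, eye_features])
print("Fused    :", fit_and_score(fused, labels))
```

On this toy label, the fused model scores highest because neither modality alone carries the whole signal, which is the intuition behind Russomanno’s claim about combined data sets.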

What Galea promises to provide is a time-locked array of sensor inputs integrated at the hardware level. By combining these different sensor readings, Russomanno believes it will be possible to more accurately create a mind-computer interface.

In a way, you could think of these sensors as agents in line with what the late A.I. researcher Marvin Minsky described in his 1986 book The Society of Mind. Minsky suggested that human intelligence is the aggregate result of interactions between a large number of simple mechanisms that are not, in and of themselves, especially intelligent. Minsky gives the example of the various agents that go into drinking a cup of tea. You have, he suggests, a grasping agent focused on keeping hold of the cup. You have a balancing agent focused on keeping the tea from spilling. You have a thirst agent that’s trying to gain nourishment by making you drink the tea. And you have an assortment of moving agents responsible for conveying the cup to your lips. Together, they combine to create an example of intelligent behavior — even if we might not think of tea-drinking as an especially intelligent task. A similar society could be achieved by combining different sensor inputs for mind-reading purposes.

Think of the possibilities

The big question, of course, is what all of this will be used for. Russomanno is clear that what OpenBCI is building is a platform, not a finished product. Galea will ship with a connected VR headset and a Unity-based SDK that will provide a few basic examples of mind-controlled VR in practice (imagine using muscle data from your face to drive a car or moving virtual objects through a type of VR telekinesis). But the real uses will be developed by the people who get access to the tool.
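OpenBCI hasn’t published details of that SDK, and the real examples would live in Unity’s C#, but the face-muscle driving example reduces to a small mapping from biosignals to controls. A hypothetical Python sketch, with made-up channel names and thresholds:

```python
def drive_from_biosignals(jaw_emg, gaze_x, emg_floor=0.05, emg_ceiling=0.6):
    """Map hypothetical biosignal readings to car controls.

    jaw_emg : muscle activation from a jaw/cheek EMG channel (arbitrary units)
    gaze_x  : horizontal gaze position in [-1, 1] from eye tracking
    """
    # Clenching harder means more throttle; normalize the EMG reading to [0, 1].
    throttle = (jaw_emg - emg_floor) / (emg_ceiling - emg_floor)
    throttle = min(max(throttle, 0.0), 1.0)
    steering = max(-1.0, min(1.0, gaze_x))  # steer toward where you look
    return throttle, steering

print(drive_from_biosignals(jaw_emg=0.4, gaze_x=-0.3))  # (~0.64, -0.3)
```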


“What we’re doing with Galea is trying to throw in all the bells and whistles we can,” he said. “We want to provide a playground, a developer playground, to allow our customers and the developer community at large to start making discoveries about what sensors are useful for what types of applications.”

One thing he has no doubt about is how transformative this could all be. On the shallow end, imagine games that sense your physiological responses and adjust their gameplay accordingly. On a more profound level, think of an app or operating system that learns how you work best and reshapes itself for maximum productivity.
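Neither of those applications exists yet, but the feedback loop behind them is simple to sketch. The toy Python below (invented thresholds, and engagement and stress scores that a headset like Galea would have to supply) nudges a game’s difficulty up when the player looks bored and backs off when stress spikes:

```python
def adjust_difficulty(difficulty, engagement, stress,
                      low_engagement=0.3, high_stress=0.8, step=0.1):
    """Toy biofeedback loop: keep the player challenged but not overwhelmed.

    difficulty         : current difficulty level in [0, 1]
    engagement, stress : hypothetical scores in [0, 1] derived from sensor data
    """
    if stress > high_stress:
        difficulty -= step        # player is overwhelmed: ease off
    elif engagement < low_engagement:
        difficulty += step        # player is bored: raise the stakes
    return min(max(difficulty, 0.0), 1.0)

# A bored, relaxed player: difficulty ticks up from 0.5 toward 0.6.
print(round(adjust_difficulty(difficulty=0.5, engagement=0.2, stress=0.4), 2))
```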

“Over the last 20 years, we’ve truly entered an attention economy — for better or worse,” Russomanno said. “What the biggest companies in the world care about is understanding user preferences, user engagement. What lights someone up? What makes somebody want to click and buy something? What makes somebody want to stay in an application or experience as opposed to logging into a different tool or different application?”

Tools like Galea aim to make this kind of information more discoverable and actionable than ever. And that’s just the beginning. “It’s going to be [a game-changer] when technology truly becomes empathetic, and evolves from this one-size-fits-all idea of ‘let me design the most attractive interface to the general public,’” said Russomanno. “What we’re going to see is operating systems that are [continuously] tweaking their settings and interfaces and applications to the preferences of the individual. In my mind, it’s undeniable that this is going to happen.”

If it does, it won’t just be Russomanno’s mind that this revolution takes place in. It will be all of ours.

Luke Dormehl
Former Digital Trends Contributor