In Star Trek’s Holodeck, lifelike simulations interacted with real people. With the Sulon Cortex, it’s HD graphics wrapping around you, filled with objects you can interact with.
You don’t have to strap on a headset to recognize that virtual reality is an exciting emerging technology. What’s out there is still mostly conceptual, offering a peek at what virtual interaction could do. So while we’re dreaming up possibilities, how about merging the virtual world with the real one?
Toronto-based startup Sulon Technologies is looking to do just that with its Cortex VR headset, which can detect the physical constructs of your surroundings while transporting you into a virtual one.
And I got to take it for a test drive.
It can add virtual objects to real space, objects that are aware of the size and scale around them.
In effect, the Cortex anchors virtual reality in real space, a key difference from other VR headsets like the Oculus Rift, where real objects placed in a virtual world have no contextual awareness. Sulon does this with what it calls a “spatial scanner,” which maps the environment around you in 360 degrees and in real time, using built-in sensors that constantly assess the physical dimensions of the space you’re inhabiting.
With this understanding of the meatspace our bodies occupy, it can add virtual objects that are aware of the size and scale of what’s around them. For example, if virtual zombies are trying to get to you and you close a real-life door behind you, they will stay outside the door, trying to get through. Or picture a large animal like a giraffe ducking down because the actual ceiling is too low.
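To make that scale-aware placement idea concrete, here’s a minimal sketch in Python, assuming the scan reduces the room to a simple bounding box. The RoomBounds and VirtualObject types and the fit_to_room function are hypothetical illustrations of the concept, not Sulon’s actual software.

```python
# Illustrative sketch only -- not Sulon's API. Assumes the spatial scanner
# reports the room as a bounding box (width, depth, height in meters) and
# each virtual object as a preferred real-life size.

from dataclasses import dataclass

@dataclass
class RoomBounds:
    width: float   # meters, from the real-time spatial scan
    depth: float
    height: float

@dataclass
class VirtualObject:
    name: str
    width: float
    depth: float
    height: float

def fit_to_room(obj: VirtualObject, room: RoomBounds, margin: float = 0.1) -> VirtualObject:
    """Uniformly shrink a virtual object so it fits inside the scanned room.

    Mirrors the giraffe example: if the real ceiling is lower than the
    object's natural height, the object is scaled down (or, in a real
    engine, animated to duck) rather than clipping through the ceiling.
    """
    usable = (room.width - margin, room.depth - margin, room.height - margin)
    scale = min(1.0,
                usable[0] / obj.width,
                usable[1] / obj.depth,
                usable[2] / obj.height)
    return VirtualObject(obj.name, obj.width * scale, obj.depth * scale, obj.height * scale)

if __name__ == "__main__":
    office = RoomBounds(width=4.0, depth=5.0, height=2.6)    # from the scan
    giraffe = VirtualObject("giraffe", 1.5, 3.0, 5.5)        # true-to-life size
    placed = fit_to_room(giraffe, office)
    print(f"{placed.name} rendered at {placed.height:.2f} m tall")  # ~2.50 m
```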
This isn’t unlike the Holodeck in Star Trek, where lifelike holographic simulations could interact with real people. Except, in this case, it’s HD graphics essentially wrapped all around you, including objects that you can interact with.
“It’s hard to quantify it in terms of a screen size, but it’s not about field of view, it’s about how immersed you feel,” Dhan Balachand, Sulon’s CEO and founder, told Digital Trends. “There’s a need — not a demand — for the display industry to move faster and to keep doubling and quadrupling resolutions, but it’s less about the resolution itself, and more about how crisp the image is and how it engulfs your senses.”
Balachand’s inspiration for the Cortex was the idea of giving users hands-on access, at true scale, to objects that are otherwise hard to get hold of. One example he cites is a diesel engine that can be assembled or pulled apart into its various components. The device renders experiences like this locally on its built-in CPU and graphics processor.
He considers the technology “huge” in scope for what it could become, and envisions scenarios where it could help. The most important of them, he believes, is situational awareness.
“Firefighters going into a blazing building fitted with the Cortex can have it reconstruct the environment, augment their path in and out of the building, and even triangulate where someone is in a room under thick smoke,” he says. “It dynamically maps as it goes through, seeing so far ahead, how everything is built and where the sound of someone screaming for help is coming from. Imagine feeding that information back to another person wearing a Cortex who can augment the map in 3D on a car hood, knowing where firefighters are in the building in real-time.”
Wearing the Cortex
I didn’t get the chance to save anyone from a virtual burning building, but I was treated to two demos of what it’s like to step into this hybrid world.
Sulon’s current headquarters are in a suburb just north of Toronto, though Balachand says a move downtown is coming soon. The semi-finished prototype I tried on was comfortable despite its notable size (Balachand says the final version will be a tad smaller). It felt substantial without being too heavy or clunky, but it’s hard to say how long you could wear it without feeling a little dazed.
The room outfitted for testing the headset at the company’s offices wasn’t large, yet it was spacious enough to move several paces in any direction. I got to stand inside a cinematic battle set in a space that looked like the entrance to a subway station, rendered to the exact dimensions of the room. I could feel the physical walls if I moved close enough to touch them (an audible beep warned me when I got close), except they looked like ceramic tile. The fighting characters paid no attention to me because they weren’t programmed to, but I could position myself at any angle in the room to watch the action. It was no more than a few minutes long, but it was a fascinating vantage point.
“It dynamically maps as it goes through, seeing so far ahead and how everything is built, where the sound of someone screaming for help is coming from.”
The second demo was interactive: I was required to walk to certain points to trigger responses in a short game where I had to defeat a big hydra emerging from a pit of lava. Despite hiccups getting it to work, I eventually managed to walk and shoot bursts of plasma and ice from my virtual hands at the beast. It would rise up right in front of me, then to my left, then over to my right. Clearly, the game itself needed some work to reduce latency and improve the graphics, but it was still immersive and fun to try out.
Where will content this immersive come from? Movies and games are the obvious sources, and 360-degree photos and video could be done just as easily. Balachand says computer-generated imagery (CGI) is much easier to design for walking around a scene, whereas doing it with live-action sequences “will take a lot of work and innovation.”
Conclusion
This highlights some of the other challenges Sulon still faces with its technology. Assembling an engine this way is intriguing for any engineer or car enthusiast, but there is no sense of touch for the individual parts. Smell is another sense that seems like a bridge too far for VR in general. The company is looking to simulate touch but couldn’t reveal the “top secret” ways it aims to achieve that. And finally, it was hard to feel entirely secure walking around a barren room once I started moving, without a true sense of the space around me.
The device’s form factor is also undergoing some tweaks before it’s made available to developers this fall for $500. The developer community is the first target, with businesses and enterprises to follow, and consumers last, once there’s sufficient content available. Like other VR headsets, the Cortex has an HDMI input for “sit-down experiences” that won’t require moving around.
“The goal is to get it to mass consumer adoption and every person’s living space because it’s a tool and a platform that can solve a lot of real-world problems,” Balachand says. “And we’re thinking everything from entertainment to engineering. Want to buy a car? Drop different ones on your driveway, change the color, and then share it with friends or family. You’ve got real estate where you can see a new kitchen to scale or even retail spaces, which we’ll have something on later this year.”
Highs
- Blends virtual and augmented reality together
- Real-world physical objects can stand in for virtual ones
- Virtual objects can be scaled to their real-life sizes
- Gaming can truly be immersive and interactive
Lows
- Bulky headset; unfinished form factor
- Live action scenes will present challenges
- Long-term prospects depend on content availability