You don’t normally see tech titans like Nvidia and Apple pair up, but the two companies announced at this week’s Nvidia GTC 2024 that they are coming together around the Vision Pro. Nvidia is bringing its Omniverse Cloud platform to Apple’s headset, allowing users to interact with objects and design directly through the Vision Pro.
The basis of support is a set of Omniverse Cloud APIs that can stream Omniverse assets to Apple’s headset. Omniverse isn’t running on the Vision Pro itself. Instead, designers can stream scenes built in Omniverse with Universal Scene Description (OpenUSD) to the Vision Pro and interact with the 3D objects natively.
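For context, OpenUSD scenes are layered, portable scene descriptions that any USD-aware runtime can load. Here's a minimal sketch using the open-source `pxr` Python bindings (our choice of tooling for illustration, not part of Nvidia's announcement) showing the kind of scene data Omniverse works with:

```python
# A minimal OpenUSD authoring sketch using the open-source pxr bindings
# (pip install usd-core). This illustrates the kind of scene description
# Omniverse Cloud streams to the headset; it is not Nvidia's API.
from pxr import Usd, UsdGeom, Gf

# Create a new USD stage (the container for a scene).
stage = Usd.Stage.CreateNew("car_configurator.usda")

# Define a root transform and a simple placeholder body for a vehicle.
root = UsdGeom.Xform.Define(stage, "/Car")
body = UsdGeom.Cube.Define(stage, "/Car/Body")
body.GetSizeAttr().Set(2.0)

# Author a displayColor so a "paint swap" is just an attribute edit.
body.GetDisplayColorAttr().Set([Gf.Vec3f(0.8, 0.1, 0.1)])  # red paint

# Save the layer to disk; a USD-aware renderer can now load or stream it.
stage.GetRootLayer().Save()
```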
Nissan demonstrated the capability in a video. Through the Vision Pro, the user can swap paint colors, adjust the trim, and even step inside the car, thanks to the headset’s spatial awareness.
It’s sure to make a splash in the enterprise sector, but there are some consumer implications here. Nvidia is essentially showing that it can stream interactable 3D applications to the Vision Pro. This is enabled by Nvidia’s Graphics Delivery Network (GDN), which is already being used to stream 3D applications from the cloud. The fact that it can work on the Vision Pro is a big deal.
The linchpin for this is the set of Omniverse Cloud APIs. Also at GTC, Nvidia revealed five new APIs centered on Omniverse Cloud and Universal Scene Description (OpenUSD) that can be used individually or collectively, as the sketch after the list illustrates:
- USD Render: support for ray-traced renders of OpenUSD data
- USD Write: support for modifying OpenUSD data
- USD Query: support for interactive scenes
- USD Notify: support for tracking USD changes
- Omniverse Channel: allows users to connect tools and projects across scenes
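Nvidia hasn’t published client code alongside the announcement, so the following is a purely hypothetical sketch of how an app might combine these APIs: query a scene, write an edit (a paint swap), and check for changes before requesting a render. Every endpoint path, host, and payload field here is invented for illustration.

```python
# Hypothetical client flow for the Omniverse Cloud APIs.
# Endpoint paths, parameters, and payloads are invented for illustration;
# Nvidia's actual API surface may look quite different.
import requests

BASE = "https://omniverse.example.com/api/v1"   # placeholder host
SCENE = "car_configurator.usda"                 # scene authored in OpenUSD

# USD Query: ask the service which prims exist in the interactive scene.
prims = requests.get(f"{BASE}/usd-query", params={"scene": SCENE}).json()

# USD Write: modify OpenUSD data -- e.g., swap the paint color attribute.
requests.post(f"{BASE}/usd-write", json={
    "scene": SCENE,
    "prim": "/Car/Body",
    "attribute": "primvars:displayColor",
    "value": [[0.1, 0.3, 0.8]],   # repaint the body blue
})

# USD Notify: check for scene changes so every connected viewer stays in sync.
changes = requests.get(f"{BASE}/usd-notify", params={"scene": SCENE}).json()

# USD Render: request a ray-traced frame of the updated scene, which the
# Graphics Delivery Network would then stream to the Vision Pro.
frame = requests.get(f"{BASE}/usd-render", params={"scene": SCENE})
```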
Right now, Omniverse Cloud on the Vision Pro is focused on enterprise applications, just as Apple’s headset itself is. Even so, it’s a critical foundation for streaming interactive 3D applications to Apple’s headset down the road. As powerful as the Vision Pro is, it can’t handle workloads like ray tracing highly detailed 3D scenes on its own. Being able to stream those scenes at full quality from the cloud could set up some exciting apps in the future.