Nvidia did what we all knew was coming: it made an AI-driven game demo. In Covert Protocol, you play as a detective trying to track down a particular subject at a high-end hotel. The promise is sleuthing through conversations with non-playable characters (NPCs) to get what you need. Except in this demo, you use your microphone and voice to ask questions instead of choosing from a list of preset options.
I saw the demo with a few other journalists in a small private showing. As the demo fired up and Nvidia’s Seth Schneider, senior product manager for ACE, took the reins, I was filled with excitement. We could ask anything; we could do anything. This is the dream for this type of detective game. You aren’t stuck playing a detective with a preset list of dialogue options. You get to ask what you want, when you want.
Schneider even put out a call for questions. We could ask the AI-driven door greeter anything. The chat was filled with a few novel ideas — is PC or console gaming better, what’s your favorite RTX GPU, that kind of thing. Schneider spoke the questions into the microphone, but sparks didn’t fly.
These AI-driven NPCs, set in their ways with an extensive network of guardrails to keep them on topic, would ignore anything unrelated to the game. The door greeter could greet you politely and maybe make a few comments about needing a break. Inside the hotel, the main setting of this detective plot, you could ask the receptionist about room numbers or how the hotel works. There’s even an executive sitting nearby who knows a bit about your subject.
All of these conversations are real conversations. They happen with a microphone, and you use your voice to talk how you want to. In a blink, the NPCs can respond through the AI model. It’s a different way of playing a dialogue-driven game. I just don’t know if it’s a better way.
The NPCs stayed on topic, never straying far enough to devolve into an AI mess. That’s Nvidia’s tech at work, but it doesn’t really change the gameplay experience. What I realized through the demo is that this tech simply opens up more dialogue trees. It gives you more ways to exhaust those dialogue options you always skip over in a dense RPG. It’s interesting, but it doesn’t make playing the game better.
When Schneider asked about console versus PC gaming, or about a particular NPC’s favorite RTX GPU, it was clear that the AI had no concept of what he was talking about. It could provide an answer that made sense in the context of the question, but not one substantially different from an answer you could get from a pre-programmed list of dialogue options.
It’s easy to look at Nvidia’s demo and get swept up in the possibilities, and I’m excited to see how developers leverage this tech in real games. I just don’t think it will immediately change how games are designed and played.
There’s something to be said for a preset list of dialogue options. NPCs often have interesting things to say, and with endless possibilities, there’s no guarantee you’ll ask the right thing. It’s hard to imagine a developer being content with hiding critical details behind an AI that might not reveal the right information. I suppose you could use the AI to flesh out a world with more NPCs that aren’t critical to the game — but do we really want more endless dialogue trees in our games?