Love the mini game idea. Absolutely loathe the gen AI idea. Happy birthday and have fun camping!
Thank you!
I'm optimistic about the genAI stuff, only because the game I'm turning over in my head is based on imperfect information, which I think would work well with generated NPCs. But I'm thinking about it from a personality perspective, not having it generate graphics or voice for the NPC.
But having it generate hundreds of NPCs and making sure that they all interact meaningfully might be pushing the limits of the current technology.
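Very roughly, this is the kind of thing I'm picturing (a hypothetical Python sketch, not tied to any particular model or API, with the generation call left as a stub): the personality and the NPC's limited knowledge stay authored data, and the model only voices them.

```python
from dataclasses import dataclass

@dataclass
class NPCPersonality:
    name: str
    temperament: str          # e.g. "suspicious", "cheerful"
    goal: str                 # what this NPC wants from the player
    known_facts: list[str]    # imperfect information: only what this NPC knows

def build_prompt(npc: NPCPersonality, player_line: str) -> str:
    """Fold a hand-authored personality into a constrained prompt."""
    facts = "; ".join(npc.known_facts)
    return (
        f"You are {npc.name}, a {npc.temperament} character whose goal is: {npc.goal}. "
        f"You only know the following: {facts}. "
        f"Reply in one or two sentences to the player, who just said: {player_line!r}"
    )

def npc_reply(npc: NPCPersonality, player_line: str) -> str:
    return generate_text(build_prompt(npc, player_line))

def generate_text(prompt: str) -> str:
    # Placeholder: swap in whatever text-generation backend a developer actually uses.
    raise NotImplementedError
```

The open question in my head is whether a few hundred of these stay coherent once they start interacting with each other.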
That might work, though how would you overcome the lack of rational or emotional processes that would link information together?
I have been thinking about this for over a week...and honestly, I don't have a good answer. I think experimentation is the only (sort of cheating) answer I have to this.
That, and a reminder that I still have a lot more to learn about the technology!
To be fair, it was a loaded question. I use genAI often, but I've grown very disenchanted with its inability to symbolically or emotionally understand information and its context.
Maybe the emotion is where the limits are for technologies like this.
Symbolic information might get there, though, or at least to a point where it's usable in a game. Complex symbolism is probably further away, but it may also be something that developers don't want to make players wade through.
The limits are that GenAI can only follow the logic of words, not the symbolic, factual, or emotional context of a conversation. I think developers will soon discover that they can't replace crafted dialogue. I can see developers using GenAI to help create conversation trees. But GenAI is not a substitute for hand-crafting conversations. At least, I've not seen anything that has convinced me otherwise.
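To make that concrete, here's a rough sketch of what I mean (entirely hypothetical, with the generation call stubbed out): the tree and its branching stay hand-authored, and generation is only an offline drafting aid that a writer reviews.

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    node_id: str
    text: str                                               # hand-written line
    choices: dict[str, str] = field(default_factory=dict)   # player choice -> next node_id

# Hand-crafted tree: structure, meaning, and tone are decided by a writer.
TREE = {
    "greet": DialogueNode("greet", "You again. What do you want?",
                          {"Ask about the ruins": "ruins", "Leave": "end"}),
    "ruins": DialogueNode("ruins", "Stay away from the ruins. People don't come back.",
                          {"Leave": "end"}),
    "end":   DialogueNode("end", "Hmph."),
}

def suggest_variants(node: DialogueNode, n: int = 3) -> list[str]:
    """Offline helper: ask a generator for n paraphrases of a hand-written line.
    Nothing generated ships without a writer accepting or editing it."""
    prompt = f"Give {n} short paraphrases of this NPC line, same meaning and tone: {node.text!r}"
    return generate_text(prompt)

def generate_text(prompt: str) -> list[str]:
    # Placeholder for whatever generation backend is used.
    raise NotImplementedError
```

The point is only where generation sits: drafting around fixed nodes, not deciding where the conversation goes.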
An aspect of the Borderlands minigame that was missing for me in this article, and in the topic in general, is consent from the player. You mentioned the minigame had to be optional and I agree, but the user should also be informed of the intent of such a minigame, imho. Personally, I'm a big fan of this sort of citizen science. Just playing a bit of devil's advocate here: I know some people would object to "working" for free in that manner, even if it is for science.
That's a good point - if you don't mind, I'm planning to reach out to some researchers and would like to ask them this question. It's a really good consideration if developers choose to include it, even as an option.
I'm not necessarily against the chatbot AI game implementation stuff. I just don't see how it helps tell a better narrative. I feel like we're at least a decade away from some developer harnessing it well. The potential is there, though.
I agree, the potential is there!
And maybe it's less about telling a narrative (i.e., not relying on the genAI to tell the tale) but to allow the genAI to enrich the experience for the player by providing more relatable characters. I'm hoping it comes quicker than a decade though!