this post was submitted on 05 Jul 2024
195 points (94.1% liked)
Not gonna happen. Not really.
So far research suggests the guardrail and hallucination problems are unsolvable, and we are seeing diminishing returns from increasing the complexity of these systems.
Hence devs will never have the control required to author an actual narrative: NPCs will end up talking about mechanics that don't exist, or saying things that contradict the overall narrative.
Even with actual people, if you just throw them in a room and have them improvise a world into existence, it never ends up quite as good as a properly authored narrative.
And LLMs are nowhere near achieving the level of internal consistency required for something like the worlds of Elden Ring or Mass Effect.
Baldur's Gate 3 contains a truly staggering amount of writing, multiple times that of classical literary works. The hallucination problem means that if all of that were AI-generated, small parts might pass inspection, but trying to immerse yourself in it as a fictional world would have you noticing immersion-breaking continuity errors left and right.