this post was submitted on 14 Jan 2024
Games
The problem with this sort of thing is that, for some reason, AI-generated conversation just feels hollow compared to human-written conversation. It's strange, because I honestly can't articulate why, but hearing an NPC talk about some event in lines written by a human carries a heightened sense of importance compared to hearing AI-generated text about that same event. Maybe it's because when it's human-written, we subconsciously know that someone cared enough to spend the time writing it specifically, so it must matter in some way.
It'll be interesting to see how this plays out when it's actually implemented in real games. If the AI NPCs were given specific plot points they were supposed to hit to move the story along, how do you prevent situations where the AI simply never gets to them, or where they lack that feeling of importance and the player just glosses over them? If it's only used for random NPCs whose dialogue isn't really important to the narrative, how do you keep it from becoming background noise that we don't pay attention to or care about?
A theory of mine about this problem is that an AI knows what it should do (because of its training data) but not whether the result was effective, since it has no metric for testing whether its output is engaging to humans. When an AI generates dialogue, it does so by copying and merging many existing snippets of text, but without a clear set of goals in doing so. When a human writes dialogue, they have a specific atmosphere in mind, a set of goals, foreshadowing, the tone shifting across the sentences, and so on. An AI might accidentally get this right from time to time, but more often than not it messes this part up.