submitted 1 year ago by BrikoX@lemmy.zip to c/gaming@lemmy.zip
[-] huginn@feddit.it 2 points 1 year ago

Here's a recent story about hallucinations: https://www.cnn.com/2023/08/29/tech/ai-chatbot-hallucinations/index.html

The TL;DR is that nobody has solved it, and it might not be solvable.

Which makes sense when you consider what LLMs actually are: statistical models. They have no grounding in any sort of truth. If the input hits the right channels, the output is effectively undefined.
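To make that concrete, here's a minimal toy sketch of next-token sampling (all probabilities and strings are made up for illustration, not from any real model). The point is that the sampler only sees token statistics; a confident answer about a fictional place comes out exactly the same way as one about a real place.

```python
import random

random.seed(0)

# Toy next-token distributions: pure statistics, no notion of truth.
# These numbers and prompts are hypothetical.
next_token = {
    "the capital of france is": {"paris": 0.7, "lyon": 0.3},
    "the capital of atlantis is": {"poseidonia": 0.5, "pacifica": 0.5},
}

def sample(prompt: str) -> str:
    """Sample a continuation weighted by the toy model's probabilities."""
    dist = next_token[prompt]
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights)[0]

# The model "answers" a question about a fictional place just as
# fluently as a real one -- it only knows which tokens tend to follow.
print(sample("the capital of atlantis is"))
```

Nothing in the sampling step checks whether the continuation is true; that's the structural reason "hallucination" isn't a bug you can patch out of this architecture.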

The Microsoft guy tries to spin this as "creativity!", but creativity requires intent. This is more like a random number generator dealing out your tarot reading and you genuinely buying into it.

this post was submitted on 03 Sep 2023
185 points (98.9% liked)
