submitted 2 months ago by ArcticDagger@feddit.dk to c/science@lemmy.world
[-] mods_mum@lemmy.today 13 points 2 months ago

I mean, LLMs can and will produce completely nonsensical outputs. It's less AI and more like bad text prediction.

[-] davidgro@lemmy.world 19 points 2 months ago

Yeah, but the point of the post is to highlight bias - and if there's one thing an LLM has, it's bias. I mean that literally: considering their probabilistic nature, it could be said that the only thing an LLM consists of is bias to certain words given other words. (The weights, to oversimplify)
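A minimal sketch of what that means, using a hypothetical toy "model" rather than any real LLM's code: the learned weights are just scores that bias which token follows a given context, and generation is nothing more than sampling from the distribution those biases define.

```python
# Toy illustration (hypothetical weights, not a real model): an LLM's output
# reduces to learned biases over which word follows the words before it.
import math
import random

# Toy "weights": conditional scores for the next word given the previous word.
weights = {
    "the": {"cat": 2.0, "dog": 1.5, "theorem": 0.2},
    "cat": {"sat": 2.5, "ran": 1.0, "proved": 0.1},
}

def next_word(context: str) -> str:
    scores = weights[context]
    # Softmax turns raw scores into a probability distribution...
    exps = {w: math.exp(s) for w, s in scores.items()}
    total = sum(exps.values())
    probs = {w: e / total for w, e in exps.items()}
    # ...and we sample from it: the output is nothing but these biases.
    return random.choices(list(probs), weights=list(probs.values()))[0]

print(next_word("the"))  # usually "cat" or "dog", occasionally "theorem"
```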

[-] slazer2au@lemmy.world 13 points 2 months ago

Regurgitation machine prone to hallucinations is my go-to for explaining what LLMs really are.

[-] Steve@communick.news 5 points 2 months ago* (last edited 2 months ago)

I heard them described as bullshitting machines. They have no concept of, or regard for, truth or lies, and just spout whatever sounds good. Much of the time it's true. Too often it's not. Sometimes it's hard to tell the difference.

[-] ReadyUser31@lemmy.world 1 point 2 months ago

will they still be like that in ten years?
