[-] Zaktor@sopuli.xyz 4 points 1 year ago

They're both BS machines and fact generators. It produced bullshit when asked about him because, as far as I can tell, he's kind of a nobody, not because it's just a stylistic generator. If he asked about a more prominent person, one likely to exist more significantly within the training corpus, the answer would likely be largely accurate. The hallucination problem stems from the system needing to produce a result regardless of whether it has a well-trained semantic model for the question.

LLMs encode both the style of language and semantic relationships. For "who is Einstein", both paths are well developed and the result is a reasonable response. For "who is Ryan McGreal", the semantic relationships are weak or non-existent, but the stylistic path is undeterred, leading to the confidently plausible bullshit.
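A toy way to see why the stylistic path is "undeterred": the decoder's softmax turns *any* set of next-token logits into a valid probability distribution, so the model always emits some fluent continuation, whether or not any of the candidates are grounded. A minimal sketch with made-up logits (the numbers are illustrative, not from any real model):

```python
import math

def softmax(logits):
    """Normalize raw scores into a probability distribution."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits when the semantic path is strong ("who is
# Einstein"): one continuation dominates sharply.
known = softmax([9.0, 2.0, 1.0, 0.5])

# Hypothetical logits for an obscure subject ("who is Ryan McGreal"):
# no continuation stands out, but softmax still yields a distribution
# that sums to 1, so sampling still picks a confident-sounding token.
unknown = softmax([1.1, 1.0, 0.9, 0.8])

print(max(known))    # sharply peaked: the model "knows"
print(max(unknown))  # nearly uniform: the model guesses anyway
```

The point of the sketch is that nothing in the decoding step signals "I don't know": a near-uniform distribution is sampled from just as readily as a peaked one, and the output text reads equally fluent either way.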

[-] Gamey@feddit.de 3 points 1 year ago

I don't understand shit like that. They're tools, not totally accurate ones, but unless you use Bing they produce a lot of good stuff when used correctly...

this post was submitted on 12 Sep 2023
151 points (100.0% liked)

Technology
