[-] AtmosphericRiversCuomo@hexbear.net 2 points 11 months ago

These models absolutely encode knowledge in their weights. One would really be showing their lack of understanding about how these systems work to suggest otherwise.
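A minimal sketch of what I mean (assuming the Hugging Face `transformers` library and the public "gpt2" checkpoint, which aren't mentioned in this thread, just an illustration): a factual completion comes straight out of the frozen weights, with no retrieval or external lookup involved.

```python
# Sketch: elicit a factual completion from a pretrained model's weights alone.
# Assumes the `transformers` library and the public "gpt2" checkpoint are available.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Greedy decoding, no sampling, no external data source.
result = generator("The capital of France is", max_new_tokens=5, do_sample=False)
print(result[0]["generated_text"])
# Often continues with "Paris" -- that association has to live somewhere in the
# weights, whatever you choose to call it.
```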

[-] KobaCumTribute@hexbear.net 5 points 11 months ago

Except they don't, definitionally. Some facts get tangled up in them and can consistently be regurgitated, but they fundamentally do not learn or model them. They no more have "knowledge" than image generating models do, even if the image generators can correctly produce specific anime characters with semi-accurate details.

[-] AtmosphericRiversCuomo@hexbear.net 1 points 11 months ago* (last edited 11 months ago)

"Facts get tangled up in them". lol Thanks for conceding my point.
