Google Gemini deletes user's code
(mashable.com)
Exactly. They’re just probabilistic models: LLMs output whatever is statistically likely to come next. That statistical process doesn’t capture any real meaning or conceptualization, just vague associations about which words tend to show up, and in what order.
What people call hallucinations are just the system’s actual behavior diverging from the user’s expectation of what it is doing: expecting it to think and understand, when all it is doing is producing a statistically likely continuation.
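To make “statistically likely continuation” concrete, here’s a toy sketch in Python. The vocabulary and scores are invented purely for illustration; a real LLM computes logits over tens of thousands of tokens with a neural network, but the final sampling step looks roughly like this:

```python
# Toy sketch of next-token sampling. The vocabulary and logits below are
# made up for illustration -- they are not any real model's output.
import math
import random

def softmax(logits):
    # Turn raw scores into a probability distribution over tokens.
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical continuations for the prompt "The cat sat on the"
vocab = ["mat", "roof", "keyboard", "moon"]
logits = [4.0, 2.5, 1.5, 0.1]  # invented scores; a real model computes these

probs = softmax(logits)
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 3) for p in probs])), "->", next_token)
```

Nothing in that loop checks whether “mat” is true or meaningful; it’s just the highest-weighted option. Run it repeatedly and it will occasionally emit “moon,” which is the same mechanism people label a hallucination.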