Anon is waiting for Japan
https://lemmy.world/post/27126654/15901324
Edit to add: For the record, I am very interested in your arguments and would love to read the reports that have come to the conclusion that LLMs produce bad output. That's news to me (or I should say, that a good prompt produces bad output; and what is considered bad, and why?). So if you have a link to a report or something similar, please share. But don't claim that I am trying to construct a strawman when THE VERY FIRST argument provided to me in this very comment chain was what I have talked about all along.
Edit 2: Here is the personal attack, the other point I disagree with: https://lemmy.world/post/27126654/15901907
See, that's your problem. You're arguing, with me, about something that was said to you by someone else. Do you realize why I'm questioning your argumentative skills?
Here's a link to a study of AI's accuracy as a search engine. The main use case proposed for LLMs as a tool is indexing a bunch of text, then summarizing and answering questions about it in natural language.
AI Search Has A Citation Problem
Another use is creating or modifying text based on an input or prompt; however, LLMs are prone to hallucinations. Here's a deep dive into what they are, why they occur, and the challenges of dealing with them.
Decoding LLM Hallucinations: A Deep Dive into Language Model Errors
I don't know why I even bother. You are just going to ignore the sources and dismiss them as well.
I'm sorry? You came to me.
Here is how I see it:
--
I don't have the time to read the articles now, so I will have to do it later, but hallucinations can definitely be a problem. Asking for code is one such situation, where an LLM can just make up functions that do not exist. A sketch of what that looks like in practice is below.
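For illustration, here is a minimal Python sketch. The hallucinated function name is invented for this example; the pattern is the kind of thing a code-generating LLM can plausibly produce, since the call looks like it should exist but does not:

```python
import os.path

# A plausible-looking but nonexistent call, the kind an LLM might invent.
try:
    ext = os.path.get_extension("report.pdf")
except AttributeError as err:
    # On CPython this prints something like:
    # module 'posixpath' has no attribute 'get_extension'
    print(err)

# The function that actually exists in the standard library:
ext = os.path.splitext("report.pdf")[1]
print(ext)  # ".pdf"
```

The snippet imports and parses fine and only fails when the hallucinated line runs, which is part of why such errors can slip past a quick review.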