cross-posted from: https://hexbear.net/post/2609634

Hmmmm… I don’t remember that Beatles song

[-] TrickDacy@lemmy.world 82 points 5 months ago

Another option would be to not lie because you think it's cool to.

[-] Beryl@lemmy.world 64 points 5 months ago

Just because yours is genuine doesn't mean theirs can't also be. That's the beauty of LLMs. They're just stochastic parrots.

[-] TrickDacy@lemmy.world 21 points 5 months ago

Yeah, maybe. It's just that after seeing several posts like this and never being able to reproduce them, I start to think people are just mad at Google.

[-] sorter_plainview@lemmy.today 10 points 5 months ago

Well, the usual pattern is that there's one genuine case, the pizza story for example, and the rest are fakes or generated memes. I just enjoy them the way I enjoy any meme.

[-] cypherpunks@lemmy.ml 30 points 5 months ago* (last edited 5 months ago)

shoutout to the multiple people flagging this post as misinformation 😂

(I don't know or care if OP's screenshot is genuine, and given that it's in /c/shitposting it doesn't matter; it's a good post either way, imo. And if the screenshot in your comment is genuine, that doesn't mean OP's isn't also. In any case, from reading some credible articles posted today on lemmy (eg), I do know that many equally ridiculous Google AI answer screenshots are genuine. Also, the song referenced here is a real fake song, which you can hear here.)

[-] TokenBoomer@lemmy.world 9 points 5 months ago

Let’s not forget that the Beatles advocated for babies driving cars.

[-] TrickDacy@lemmy.world 9 points 5 months ago

Mine is genuine, take it or leave it

I often find these kinds of posts aren't reproducible. I suspect most are fake.

[-] assaultpotato@sh.itjust.works 5 points 5 months ago* (last edited 5 months ago)

Depends on the temperature setting for the LLM/context, which I'm assuming Google will have set quite low for this.

[-] TrickDacy@lemmy.world 2 points 5 months ago
[-] assaultpotato@sh.itjust.works 5 points 5 months ago

Yeah, it's basically a measure of randomness in LLM responses. A low temperature makes the LLM more consistent and reliable; a higher temperature makes it more "creative". The same prompt at a low temperature is more likely to produce repeatable output, while a high temperature introduces a higher risk of hallucinations, etc.

Presumably Google's "search suggestions" run at a very low temperature, but that doesn't prevent hallucinations, just makes them less likely.
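To make the "measure of randomness" concrete: temperature divides the model's logits before the softmax that turns them into token probabilities. Here's a minimal sketch with a toy three-token logits vector (all values illustrative, not from any real model):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then softmax.

    Lower temperature sharpens the distribution (near-deterministic
    sampling); higher temperature flattens it (more "creative").
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for three candidate next tokens.
logits = [2.0, 1.0, 0.1]

cold = softmax_with_temperature(logits, 0.2)  # almost all mass on token 0
hot = softmax_with_temperature(logits, 2.0)   # much flatter distribution
```

At temperature 0.2 the top token gets over 99% of the probability mass, so sampling is effectively repeatable; at 2.0 the three tokens are much closer together, so repeated runs of the same prompt diverge. Note the caveat from the comment above still holds: even a sharply peaked distribution can put its peak on a wrong token, so low temperature reduces but doesn't prevent hallucinations.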

this post was submitted on 24 May 2024
324 points (87.0% liked)

shitposting


Rules: 1. No doxxing. 2. No TikTok reposts. 3. No harassing. 4. Post gore at your own discretion; it depends on whether it's funny or just gore to be an edgelord.

founded 3 years ago