Kill me now.

[-] Riven@lemmy.dbzer0.com 43 points 5 months ago

I tried the same AI and asked it to provide a list of 20 things, but it only gave me 5. When I asked for the rest, it apologized and then provided them. It's weird that it stumbles at first but is able to see its error and fix it. I wonder if it's something it 'learned' from the data set: people not correctly answering prompts the first time.

[-] webghost0101@sopuli.xyz 10 points 5 months ago

Something else I also encounter a lot with GPT-4 is asking "why did you do x or y" out of general curiosity about how it handles the task.

Almost every time it apologizes and does a full redo, avoiding x or y.

[-] Gabu@lemmy.world 0 points 5 months ago

Might be an intentional limitation to avoid issues like the "buffalo" incident with GPT-3 (it would start leaking information it shouldn't after repeating a word too many times).

this post was submitted on 26 Mar 2024
269 points (89.9% liked)

A Boring Dystopia