loose
Irony?
must of made a mistake their
your so dumb lmao
thank you kind stranger
Should of proof red it
I need to of a word with you
Knead*
Now when you submit text to ChatGPT, it responds with “this.”
Unironically this
Criminaly underated post
As a language model, I laughed at this way harder than I should have
NTA, that was funny.
I'm waiting for it to start using units of banana for all quantities of things
ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”
Coincidence? I don't think so.
This is exactly what I was thinking.
And maybe some more people did what I did: not deleting my accounts, but replacing all my posts with content created by a bullshit generator. The texts looked normal, but everything was completely senseless.
Back in June-July, I used a screen-tapping tool + Boost to go through and change every comment I could edit to generic filler text, waited something like two weeks in hopes that all of their servers would update to the new text, then used the same app to delete each comment and post, and then the account itself. It's about all I could think to do.
They have always trained on Reddit data; GPT-2 was, at least. I'm unsure about GPT-1.
ChatGPT also chooses that guy's dead wife
The Narwhal Bacons at Midnight.
On the contrary, it'll become excessively perfectionist about it. Can't even say "could have" without someone coming in and saying "THANK YOU FOR NOT SAYING OF"
It already was, the only difference is that now reddit is getting paid for it.
It was already trained on Reddit posts. It's just now they're paying for it.
It's going to be a poop knife wielding guy with 2 broken arms out to get those jackdaws.
From now on, when you say something like "I think I can give my hoodie to my girlfriend", it will answer "and my axe".
ChatGPT was already trained on Reddit data. Check this video to see how one reddit username caused bugs on it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
And between were, we’re and where.
Insure and ensure.
"What is a giraffe?"
ChatGPT: "geraffes are so dumb."
“I have not been trained to answer questions about stupid long horses.”
Your right.
"Can't even breath"
And then and than.
And when it learns something new, the response will be "Holy Hell".
TIL
Is it a showerthought if it's actually just incorrect
Sure it might have some effect, but a big part of ChatGPT besides "raw" training data is RLHF, reinforcement learning from human feedback. Realistically, the bigger problem is training on AI-generated content that might have correct spelling, but hardly makes sense.
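For anyone curious what the RLHF part actually optimizes: the reward-model step is typically a pairwise preference loss, where the human-preferred response should score higher than the rejected one. A toy sketch (the function name and the numbers are mine, not anything from OpenAI's actual pipeline):

```python
import math

def pairwise_loss(reward_chosen: float, reward_rejected: float) -> float:
    """Bradley-Terry style loss: -log sigmoid(r_chosen - r_rejected).

    The loss shrinks as the reward model rates the human-preferred
    response further above the rejected one.
    """
    margin = reward_chosen - reward_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# A clear preference produces a smaller loss than a near-tie.
confident = pairwise_loss(2.0, -1.0)   # chosen clearly ahead
uncertain = pairwise_loss(0.1, 0.0)    # nearly tied
assert confident < uncertain
```

So badly spelled Reddit data mostly shapes the pretraining distribution, while this feedback signal is what later pushes the model toward "polished" answers.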
Then I did the right thing by replacing my texts with correctly spelled nonsense.
The same goes for Gemini; Google bought access to the API.
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.