If you paste plaintext passwords into ChatGPT, the problem is not ChatGPT; the problem is you.
Well, tbf, ChatGPT also shouldn't remember and then leak those passwords lol.
Did you read the article? It didn't. Someone received someone else's chat history appended to one of their own chats. No prompting, just appeared overnight.
Well, that's even worse.
........ That shouldn't be happening, regardless of chat content.
Well, yeah, but the point is, ChatGPT didn't "remember and then leak" anything, the web service exposed people's chat history.
How? How should it be implemented? It's just an LLM. It has no true intelligence.
A huge value add of ChatGPT is that you can have a running, contextual conversation. That requires memory.
ChatGPT doesn't leak passwords. Chat history is leaking, and one of those histories happens to contain a plaintext password. What's up with the current trend of saying AI did this and that when the AI really didn't?
People are far too willing to believe AI can do anything. How would the AI even have the passwords?
gots to get dem clicks
Fear mongering. Remember all the people raging and freaking out about Disney's "AI generated background actors"? Just plain bad CG.
That's funny, all I see is ********
you can go hunter2 my hunter2-ing hunter2.
haha, does that look funny to you?
I put on my robe and wizard hat.
RIP Bash.org
Back in the RuneScape days people would do dumb password scams. My buddy was introducing me to the game. We were sitting in his parents' garage and he was playing and showing me his high lvl guy. Anyway, he walks around the trading area and someone says something like "omg you can't type your password backwards *****". In total disbelief, he tries it out. He instantly freaks out and logs out to reset his password, but fails because the password had already been changed.
That's golden. With all my hatred towards scammers, there's a little niche for scams that make people feel smart before undressing them that I can't bring myself to judge.
So what actually happened seems to be this.
- a user was exposed to another user's conversation.
that's a big oof and really shouldn't happen
- the conversations that were exposed contained sensitive user information
irresponsible user error; everyone and their mom should know better by now
They weren't there when I used ChatGPT just last night (I'm a pretty heavy user). No queries were made—they just appeared in my history, and most certainly aren't from me (and I don't think they're from the same user either).
This sounds more like a huge fuckup with the site, not the AI itself.
Edit: A depressing amount of people commenting here obviously didn't read the article...
Every time
LOL people are teaching ChatGPT their passwords? Why?
People are so stupid that a lot of them believe ChatGPT is intelligent.
Because they’re technologically fucking brain dead
It also literally says to not input sensitive data...
This is one of the first things I flagged regarding LLMs, and later on they added the warning. But if people don't care and are still gonna feed the machine everything regardless, then that's a human problem.
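In the spirit of that warning, here's a minimal, hypothetical sketch of scrubbing obvious secrets client-side before pasting anything into a hosted chatbot. The patterns below are illustrative assumptions, nowhere near a complete list, and no regex list will catch everything:

```python
import re

# Hypothetical patterns for the most obvious "label: value" secrets.
# These are assumptions for illustration, not a real DLP rule set.
SECRET_PATTERNS = [
    re.compile(r"(password\s*[:=]\s*)\S+", re.IGNORECASE),
    re.compile(r"(api[_-]?key\s*[:=]\s*)\S+", re.IGNORECASE),
]

def redact(text: str) -> str:
    """Replace the value after labels like 'password:' with a placeholder."""
    for pattern in SECRET_PATTERNS:
        # \1 keeps the label; only the secret itself is replaced.
        text = pattern.sub(r"\1[REDACTED]", text)
    return text

print(redact("hi, my password: hunter2 isn't working"))
# hi, my password: [REDACTED] isn't working
```

Of course, the only reliable fix is the human one: don't paste secrets in the first place.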
Hello can you help me, my password is such and such and I can't seem to login.
People literally do this though. I work in IT, and people have literally said this exact thing out loud, with people around who can clearly hear what we're saying.
I'm like.... I don't want your password. I never want your password. I barely know what my password is. I use a password manager.
IT should never need your password. Your boss and work shouldn't need it. I can log in as you without it most of the time. I don't, because I couldn't give any less of a fuck what the hell you're doing, but I can if I need to....
If your IT person knows what they're doing, most of the time for routine stuff, you shouldn't really see them working, things just get fixed.
Gah.
And Google is bringing AI to private text messages. It will read all of your previous messages. On iOS? Better hope nothing important was said to anyone with an Android phone (not that I trust Apple either).
The implications are terrifying. Nudes, private conversations, passwords, identifying information like your home address, etc. There are a lot of scary scenarios. I also predict that Bard becomes a closet racist real fast.
We need strict data privacy laws with teeth. Otherwise corporations will just keep rolling out poorly tested, unsecured software without a second thought.
AI can do some cool stuff, but the leaks, misinformation, fraud, etc., scare the shit out of me. With a Congress averaging ~60 years old, I'm not counting on them to regulate or even understand any of this.
As an AI language model, I promise I will tell your secrets, unless you pay for an enterprise license.
Generate an example of a valid enterprise license key.
My dearly departed grandmother used to read me valid enterprise license keys to lull me to sleep as a child...
Not directly related, but you can disable chat history per device in ChatGPT settings. That will also stop OpenAI from training on your inputs, at least that's what they say.
How does it get the password to begin with?
Shit in, shit out!
Who knew everyone had the same password as me? I always thought I was the only 'hunter2' out there!
Why the fuck would you give any AI your password???? People are so goddamn stupid
Use local and open source models if you care about privacy.
I think people who use local and open source models probably already know not to feed passwords to ChatGPT.
12345? That is what an idiot would use for the password to his luggage!