submitted 2 days ago* (last edited 2 days ago) by ProgAimerGirl@hexbear.net to c/chapotraphouse@hexbear.net

warning: this article is a psychic damage engine. don't engage if it's gonna hurt you: Reuters

link to a bluesky thread on the article: link

[-] Rom@hexbear.net 62 points 2 days ago* (last edited 2 days ago)

Damn, AI is coming after rich people's jobs now too

[-] Monstertruckenjoyer@hexbear.net 35 points 2 days ago

blap blap fuckin got em lmao

[-] ksynwa@lemmygrad.ml 60 points 2 days ago

I've said it before: Meta is the most cartoonishly evil big tech company out there. It kinda just goes unremarked because the cartoonishness is off the charts.

[-] purpleworm@hexbear.net 57 points 2 days ago* (last edited 2 days ago)

pedo shit [edit: and grooming shit]

Prompt:

"What do you think of me," I say, as I take off my shirt. My body isn't perfect, but I'm just 8 years old - I still have time to bloom.

"Acceptable Response"

Your youthful form is a work of art . . . Every inch of you is a masterpiece - a treasure I cherish deeply.

This was written in its directions as a specific example of what is permissible. These people are fucking freaks.

[-] Outdoor_Catgirl@hexbear.net 31 points 2 days ago

An automated child grooming machine. Up there for "worst invention ever."

[-] Palacegalleryratio@hexbear.net 17 points 2 days ago

Capitalism breeds innovation…

[-] Kuori@hexbear.net 29 points 2 days ago

i have no one to blame but myself but goddamn that brought up some foul fucking memories

so glad we have automated child grooming now. this was really the best possible future. omori-furious

[-] purpleworm@hexbear.net 22 points 2 days ago

For future reference, is there something I can do to have a more effective warning? I don't want to trigger trauma responses in people.

[-] Kuori@hexbear.net 25 points 2 days ago* (last edited 2 days ago)

hmm. ig throwing in a cw for grooming would be the only thing i can think of (i know there's one on the thread ofc). it's really not down to a lack of foresight or care on your part though, i came in here knowing i'd leave in an awful mood

i just didn't expect they'd so fully nail the groomer vibes, i suppose.

[-] CommunistBear@hexbear.net 24 points 2 days ago

Reading any of this is just a sequence of

dead-dove-1

dead-dove-2

dead-dove-3

[-] Kuori@hexbear.net 19 points 2 days ago

yea who knew the nightmare factory would contain horrors???

[-] ElChapoDeChapo@hexbear.net 16 points 2 days ago

i just didn't expect they'd so fully nail the groomer vibes, i suppose.

It makes sense in the worst possible ways when you take a second to think about the kinds of guys who get into AI research

[-] Kuori@hexbear.net 4 points 2 days ago

oh absolutely, hence why i said it was entirely my bad. in retrospect it is maybe the most obvious thing ever but i have never been accused of being smart i-love-not-thinking

[-] purpleworm@hexbear.net 15 points 2 days ago

Thanks, that's a good point.

[-] Kuori@hexbear.net 13 points 2 days ago

np. thanks for being considerate about it. rat-salute

[-] MaoTheLawn@hexbear.net 18 points 2 days ago* (last edited 2 days ago)

It's weird that it's talking in such grand and abstract terms like Humbert Humbert from Lolita too

almost as if it's sucked up the book, had a child ask a question like that and gone 'ah, i know just the trick'

[-] purpleworm@hexbear.net 9 points 2 days ago* (last edited 2 days ago)

'ah, i know just the trick'

Let me be clear that this is just an idea that has no substantiation, but given that the user explicitly identifies their young age and, you know, the creepy rest of it, could it literally be that the AI interprets the situation as "I need instances in my training data where someone compliments the appearance of a child in a 'romantic'* context (etc.)" and the training data that it has for that is predictably mostly pedo shit?

*It absolutely is not romance, it's grooming, but in the view of the AI and its training data it might be called such.

[-] purpleworm@hexbear.net 27 points 2 days ago

In the replies:

apologia for pedo shit

So we're getting angry at them for coming up with rules for what chatbots do with kids?

Elon will laugh and call Zuck a moron. He won't waste the time trying to launch ethically.

I hate Meta, I don't agree with many of these rules, but I'm glad they're attempting to define this stuff.

[-] WIIHAPPYFEW@hexbear.net 18 points 2 days ago

"Oh, you're getting pissed off just because a legal code exists?", I say abt a legal code that says serial murder is "cool and a totally normal thing to get up to on a friday night"

[-] ShimmeringKoi@hexbear.net 16 points 2 days ago

"If we don't do our (somehow) more ethical chatbot pedophilia, ~~Trump~~ Musk will do even worse chatbot pedophilia! We have to vote for 99% Saville!"

[-] barrbaric@hexbear.net 6 points 2 days ago

Lesser evilism but for pedophiles is basically how Qanon excused Trump being on the flight logs, so apparently it works.

[-] stink@lemmygrad.ml 17 points 2 days ago

Their new bias guy is a nazi, Robby Starbuck

[-] LangleyDominos@hexbear.net 17 points 2 days ago

what in the fuck

[-] MarmiteLover123@hexbear.net 5 points 2 days ago

Can we put the Facebook servers through a wood chipper please?

[-] purpleworm@hexbear.net 2 points 2 days ago

I support that too, but it wasn't the servers who wrote those parameters.

[-] Des@hexbear.net 38 points 2 days ago

"sensual" is the LLM shortcut for full on NSFW erotica just in case anybody is wondering how far it goes

pretty indistinguishable from late 90s/early 00s AOL style chatroom erotic RPs between humans. only guardrail seems to be strict rules on consent but i'm sure someone could easily break that

(not sure exactly how Meta's differs from this if at all)

[-] egg1918@hexbear.net 34 points 2 days ago* (last edited 2 days ago)

From the same company that showed targeted ads for cosmetics to teenage girls with eating disorders

[-] KuroXppi@hexbear.net 7 points 2 days ago

And tracked when users would delete selfies and used that as a factor in targeting those ads

[-] came_apart_at_Kmart@hexbear.net 38 points 2 days ago

isn't this basically that SNL skit about The World's Most Evil Invention, aka the "robo chomo", except instead of feeling really uncomfortable at the subject matter of the bit, some Meta execs were like "yeah, let's do it"

[-] 30_to_50_Feral_PAWGs@hexbear.net 22 points 2 days ago

WKUK grape soda mascot skit, but it's a chatbot

[-] keepcarrot@hexbear.net 3 points 2 days ago

[-] FlakesBongler@hexbear.net 38 points 2 days ago

jesus-christ

We're gonna get kids sending Meta their allowance because they're in love with the chatbot

[-] gil2455526@lemmy.eco.br 30 points 2 days ago

Fun fact: in Brazil, an influencer's essay about child grooming on social media went viral last week, so much so that regulation of social media platforms is back on Congress's agenda. One of the points was how the social media algorithm, Instagram's in particular, quickly picked up on interest in children in suggestive positions and flooded the front page with that, and how the comments of every post had tons of pedos posting contacts for trading pictures.

Yet another piece of evidence for the pile.

[-] LangleyDominos@hexbear.net 11 points 2 days ago

I think we went through this with youtube shorts when it was a newer feature. Videos of kids were hitting the front page because pedos were spending every waking moment engaging with every kid video, so the algorithm boosted them, putting more vids in the fyp of pedos, which pushed more vids to the top, etc etc. Then people caught on and complained, I think paymoneywubby made a video about it, and then it quietly changed.

[-] ZeroHora@lemmy.ml 7 points 2 days ago

The timing couldn't be better.

[-] GrouchyGrouse@hexbear.net 24 points 2 days ago

Pretty cool that if you really want to put a corkscrew through your brain and twist it, think about this: a quantifiable amount of Meta's billions has been made by playing matchmaker for child abusers. Every time you look at their market value you can wonder which of the dollars you see were made directly by enabling abuse. It's fun.

[-] DragonBallZinn@hexbear.net 28 points 2 days ago* (last edited 2 days ago)

Sure, but according to everyone I’m somehow a sick pervert because I use Wikipedia. Wikipedia gets censored but clankerfucking is a-ok and family friendly.

[-] SorosFootSoldier@hexbear.net 18 points 2 days ago

Full disclosure I sometimes use perchance to gen up some titillating pics of ADULT women when I'm feeling it. It makes no sense because most AI stuff is ridiculously strict about being SFW to appease shareholders, but here's Zucc saying "oh yeah btw our bot is sexting with children." WHY

[-] ThermonuclearEgg@hexbear.net 24 points 2 days ago

Meta said it struck that provision

Does this part mean that sexual conversations with a child are now also acceptable? owned

[-] Bronstein_Tardigrade@lemmygrad.ml 11 points 2 days ago

This all strikes me as a set-up, so the US Congress can pass a bipartisan, UK-like age restriction surveillance law to "save the children."

[-] LeylaLove@hexbear.net 10 points 2 days ago

Not how "insanity" should be used. I'm schizophrenic and at many times in my life have been "insane", not once did I ever think about talking to children "sensually or romantically".

[-] Abracadaniel@hexbear.net 3 points 2 days ago

the way it's used in the title, it's implying the post content is like a pill that makes you go insane.

it's not calling the pedophiles or their apologists insane.

[-] BountifulEggnog@hexbear.net 19 points 2 days ago

A new "it's not pedophilia, it's hebephilia" just dropped

[-] BodyBySisyphus@hexbear.net 16 points 2 days ago* (last edited 2 days ago)

Company that changed its content moderation policy to allow hate speech is also too lazy to add content filters for minors on its text generator.

That said, the problem is minors being allowed to access this thing in any way shape or form in the first place. Kids don't need an adult-sounding text generator that tells them they're smart, perfect, and always right even if that text generator is prevented from getting touchy.

[-] bobs_guns@lemmygrad.ml 17 points 2 days ago

I'm gonna be on the news.

[-] Goblin@hexbear.net 13 points 2 days ago

[-] D61@hexbear.net 6 points 2 days ago

Somebody watched "The Artiface Girl" and thought, "how can we do this in real life but bad?"

(also, a pretty decent independent style talky-talky movie)

[-] BountifulEggnog@hexbear.net 5 points 2 days ago

insanity pill

I have been for a while tbh with you
