submitted 2 days ago by git@hexbear.net to c/technology@hexbear.net
[-] hello_hello@hexbear.net 6 points 2 days ago

All closed-dataset LLMs must have absorbed large amounts of CSAM in their training data, and that thought doesn't horrify people enough.

[-] lime@feddit.nu 1 point 1 day ago

probably, but even if they haven't, they can synthesize it no problem. an llm is an engine that lets you do math on concepts, so if you combine two completely unrelated things you get a blend of them even if that combination never appeared in the dataset.
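
the "math on concepts" idea can be sketched with the classic word-embedding analogy ("king" − "man" + "woman" ≈ "queen"): concepts become vectors, and arithmetic on those vectors composes them. the toy 3-dimensional vectors below are made up purely for illustration; real models learn embeddings with hundreds of dimensions from data.

```python
import numpy as np

# Hypothetical 3-dimensional "concept" embeddings, hand-picked so the
# analogy works out. Real embeddings are learned, not assigned.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "queen": np.array([0.1, 0.8, 0.9]),
}

def nearest(v, vocab):
    """Return the vocabulary word whose embedding has the highest
    cosine similarity to vector v."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(v, vocab[w]))

# Compose concepts by vector arithmetic: king - man + woman.
combo = vectors["king"] - vectors["man"] + vectors["woman"]
print(nearest(combo, vectors))  # -> "queen"
```

the point of the sketch: the result "queen" falls out of arithmetic on other concepts, not from a stored example, which is why a generative model can produce combinations absent from its training data.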

[-] Belly_Beanis@hexbear.net 1 point 1 day ago

These chatbots had to have been trained on actual conversations from Facebook/Instagram/whatever. In other words, Zuckerberg and his goons know children are being preyed upon on their platforms, but they do nothing about it except use it as a source of training data.

this post was submitted on 16 Aug 2025
40 points (100.0% liked)
