Meta pirated and seeded porn for years to train AI, lawsuit says
(arstechnica.com)
So what are they doing with the data? Is this all being fed into the LLM or image generating AI to create ultra realistic porn? To what end? I don't see their endgame unless it involves sexbots.
Pure speculation: possibly to identify sexual nudity and "inappropriate" content as some kind of legitimate use case. What was actually done, I have no idea.
This feels most likely to me.
Meta doesn't exactly want to taint their brand image with purely sexual content being generated by their base models, so it's probably for content classification, and/or for fine-tuning their LLMs and other generative models in reverse - that is to say, fine-tuning them not to create content like what they're being fed.
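To give a sense of what the classification angle would look like in practice, here's a rough, purely illustrative sketch of fine-tuning an off-the-shelf vision backbone as a binary explicit-content classifier. None of the names or numbers come from Meta; it's just a generic PyTorch/torchvision example.

```python
import torch
import torch.nn as nn
from torchvision import models

# Generic sketch: start from a pretrained backbone and swap the head for a
# two-class output ("safe" vs. "explicit"). Hyperparameters are illustrative.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """images: (N, 3, 224, 224) batch; labels: (N,) tensor of 0 (safe) / 1 (explicit)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```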
A lot of artists will practice anatomy by drawing people nude, largely because it’s hard to get a good understanding of anatomy by only drawing people with clothes on.
If you wanted to put some examples of bare human anatomy in odd positions to expand the range that the model is capable of, well, there aren't many larger corpora of that than porn.
Also, even if they don't want it to make explicit content, they probably want it to make "suggestive" or "appealing" content, and they just assume they can guardrail it away from making actual explicit content. Although that's probably pretty short-sighted given how weak guardrails really are.
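As a hypothetical illustration of how blunt those guardrails tend to be, an output-side filter often boils down to something like this sketch - `nsfw_score` is a made-up stand-in for whatever classifier would actually be used:

```python
def nsfw_score(image) -> float:
    """Hypothetical stand-in: return a probability in [0, 1] that the image is explicit."""
    return 0.0  # placeholder so the sketch runs; a real system would call a classifier here

def guarded_generate(generate_fn, prompt: str, threshold: float = 0.8, max_retries: int = 3):
    """Regenerate up to max_retries times, then give up - a blunt, easily bypassed filter."""
    for _ in range(max_retries):
        image = generate_fn(prompt)
        if nsfw_score(image) < threshold:
            return image
    return None
```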
Let's be honest now... Zuckerberg is building a globally-distributed, industrial-scale, disaster-proof spank bank for himself.
Well, Stable Diffusion 3 supposedly had all porn purposefully removed from its training data and the model negatively trained on porn, and that apparently destroyed its ability to generate proper anatomy.
Regardless, image generation models need some porn in training to at least know what porn is so that they know what porn is not.
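The inference-time face of "knowing what porn is not" is the negative prompt: a diffusion pipeline can only steer away from concepts the model has actually learned. A rough sketch with Hugging Face diffusers (the model ID is just an example, not anything Meta uses):

```python
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint only: any Stable Diffusion model works the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    prompt="figure study of a dancer, dramatic lighting",
    # The model can only avoid these concepts because it has representations of them.
    negative_prompt="nudity, nsfw, explicit content",
).images[0]
image.save("dancer.png")
```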
It's related to a process called regularization - keeping any particular computational model from over-fitting.
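For reference, the textbook regularization knobs look something like this generic PyTorch sketch (dropout plus L2 weight decay; the numbers are arbitrary and have nothing to do with Meta's actual setup):

```python
import torch
import torch.nn as nn

# Two common regularizers: dropout inside the model and L2 weight decay in the optimizer.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Dropout(p=0.1),   # randomly zeroes activations during training
    nn.Linear(256, 10),
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)  # L2 penalty
```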
Well, there's also censorbots.