Top-notch journalism. Even today, the "legit" sites have an "I am over 18" button at best, and in general they just block users from states with more stringent requirements. Are we really supposed to hate seeders just because Ars Technica says so?
They're pointing out the double standard.
If you seed porn, it's a federal offense.
If Meta does it, it's capitalism.
While I agree with you to a point, they didn't stop there, or even bother to make that point at all. They escalate the seeding of porn into the willful distribution of porn to children. The fact that it's a corporation doing the seeding just makes for an easy target for such escalation.
Ars Technica is not asserting that themselves; that's the argument that Strike3 is making. Strike3 and other porn companies attack non-professional porn on these grounds as well, to try to kill their competition.
For research purposes, which tracker did Meta use?
How funny would it be if that employee was on a different team and was torrenting for personal use and got caught up in this lol.
So what are they doing with the data? Is this all being fed into the LLM or image generating AI to create ultra realistic porn? To what end? I don't see their endgame unless it involves sexbots.
Pure speculation: possibly to identify sexual nudity and "inappropriate" content as some kind of legitimate use case. What was actually done, I have no idea.
This feels most likely to me.
Meta doesn't exactly want to taint their brand image with purely sexual content generated by their base models, so it's probably for content classification, and/or for fine-tuning their LLMs and other generative models in reverse: that is, fine-tuning them to not create content like what they're being fed.
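If the content-classification guess is right, the setup would probably look something like this. A minimal sketch only, assuming a labeled safe/explicit image dataset; the backbone, labels, and hyperparameters are illustrative, not anything Meta has confirmed:

```python
# Hedged sketch: one common way to build a binary content classifier.
# Nothing here is Meta's actual pipeline; the dataset is hypothetical.
import torch
import torch.nn as nn
from torchvision import models

# Generic pretrained backbone with the final layer swapped for two classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # 0 = safe, 1 = explicit

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised step: image batch -> safe/explicit logits."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```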
A lot of artists will practice anatomy by drawing people nude, largely because it’s hard to get a good understanding of anatomy by only drawing people with clothes on.
If you wanted to put some examples of bare human anatomy in odd positions to expand the range that the model is capable of, well there aren’t many larger corpuses of that than porn.
Also, even if they don't want it to make explicit content, they probably want it to make "suggestive" or "appealing" content. And they just assume they can guardrail it away from making actual explicit content, although that's probably pretty short-sighted given how weak guardrails really are.
Let's be honest now... Zuckerberg is building a globally-distributed, industrial-scale, disaster-proof spank bank for himself.
Well, Stable Diffusion 3's training supposedly had all porn purposefully removed, and the model was negatively trained on porn; apparently that destroyed the model's ability to generate proper anatomy.
Regardless, image generation models need some porn in training to at least know what porn is so that they know what porn is not.
It's part of a process called regularization, i.e. preventing any particular computational model from overfitting.
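For the curious, the textbook form of that idea is an explicit penalty term in the loss. A minimal sketch with an L2 (weight-decay) penalty; the data and the penalty strength `lam` are made up for the example:

```python
# Minimal illustration of regularization: an explicit L2 penalty on the
# weights discourages over-fitting by shrinking them toward zero.
import torch

x, y = torch.randn(32, 10), torch.randn(32)   # toy inputs/targets
w = torch.randn(10, requires_grad=True)        # toy linear model weights

lam = 1e-2                                     # regularization strength (illustrative)
pred = x @ w                                   # predictions, shape (32,)
loss = ((pred - y) ** 2).mean() + lam * (w ** 2).sum()  # data loss + L2 penalty
loss.backward()                                # gradients include the penalty term
```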
Well, there's also censorbots.
Meta is based for once?