Deepfake Porn Is Out of Control
(www.wired.co.uk)
I don't think that's at all similar. "Boys will be boys" is "we know it's bad, but we can't stop them."
The argument is... is it really bad? After all, isn't it the "scandal" that really causes the damage? It's not like any harm is directly done to the person; someone could've already done this to me and I wouldn't be any the wiser. It's when they start sharing it, and society reacts as if it's real and there's something scandalous, that there's a problem.
If we stop considering it scandalous... The problem kind of goes away... It's not much different than AI photoshopping a hat on someone that they may or may not approve of.
I've never researched these tools or used them... But I'd wager that's going to be next to impossible. If you think the war on drugs was bad... A war on a particular genre of software would be so much worse.
Like a lot of things... I think this is a question of how we adapt to new technology, not how we stop it. If I actually believed this was stoppable, I might agree with you... But it actually seems more dangerous to try to make the tools hard to obtain than to just give people plausible deniability.
You mentioned bullying, and I'm definitely empathetic to that. I don't know that this would really make things worse vs the "I heard Katie ..." rumor crap that's been going on for decades. Feminism has argued for taking the power away by removing the taboo around women having sex lives... and that seems equally relevant here.
Either way, it really seems like a lot more research is needed.
"Just stop considering it scandalous" is a severe lack of imagination. Even if/when the stigma of "having a sex life" is gone, the great majority of people consider their sex life to be private. Video floating around that looks like you having sex is a very different thing to hearsay rumors.
Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying. This isn't a problem society can shrug away by saying sex should be less stigmatized.
And a "video" should ruin those things why?
Literally everything you listed is because society is making a big stink of things that don't matter.
Why should your job care ... even if it's real?
If somebody didn't cheat, and there's no other reason to believe they did than a ... suspiciously careless video of someone who looks like them... Why in the world should that end their relationship?
Not to mention, AI isn't going to get the details right. It's going to get the gist right, but anyone who's actually seen you naked is presumably going to be able to spot some details that are off.
Also, in terms of privacy, your privacy wasn't violated. Someone made a caricature of you.
It's really not; the only reason it seems that way is that video has been trustworthy for the past century, and now it's not.
I hope you folks downvoting me have some magic ace up your sleeve, but I see no way past this other than through it. Just like when the atom bomb was invented, it's technology that exists now and we have to deal with it. Unlike the atom bomb, it's just a bunch of computer code, and at some point pretty much any idiot is going to be able to get their hands on convincing versions of it. Also unlike the atom bomb, it can't actually kill you.