top 50 comments
[-] flubba86@lemmy.world 91 points 1 year ago* (last edited 1 year ago)

Kinda weird that it details how badly this affected the girls' mothers. The girls don't get a say, but won't someone please think of the mothers?!

[-] NotAPenguin@kbin.social 33 points 1 year ago

It is pretty weird, like "The reaction was one of massive support for all the affected mothers"..?

[-] ParsnipWitch@feddit.de 16 points 1 year ago* (last edited 1 year ago)

How do the girls not get a say? They asked their mothers for help, and the mothers organised to find others who are affected.

[-] EnderMB@lemmy.world 2 points 1 year ago

I imagine it leans into the idea of some people being "too young" to form a grown-up opinion.

Really fucking weird, given the context is around their likeness being used for the purpose of porn.

[-] erranto@lemmy.world 43 points 1 year ago* (last edited 1 year ago)

Maybe we should shift our thinking to assume that everything posted on the internet is fake. That's the only way to counter the proliferation of AI. The genie is out of the bottle and can't be forced back.

Only believe information from official sources that is cryptographically signed.

[-] stevedidwhat_infosec@infosec.pub 10 points 1 year ago

This is a really, really oversimplified solution, and I'm gonna argue it's not at all effective. The cat is out of the bag, just like you said. You can't undo that.

Nothing on the internet is real, okay let’s start from there.

So now how do we relay scientific findings for example? Rely on the media? Pray to some god and hope our reasoning and interpretations are correct enough?

Should we trust video recordings? Pictures?

Should we trust word of mouth? Each other? Ourselves?

[-] Vqhm@lemmy.world 3 points 1 year ago

Eh, most dashcams have metadata with GPS, timestamps, etc.

GPS locations, time, that's just math tho. But we could put a private key on every camera and digitally sign every photo/video to prove where it came from.
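A minimal sketch of what that per-camera signing could look like (hypothetical: the names and the Ed25519 choice are mine, using Python's `cryptography` package; a real camera would keep the key in secure hardware rather than generate it at runtime):

```python
# Hypothetical sketch of per-camera signing with an Ed25519 key.
# In a real device the private key would be provisioned into a
# secure element at the factory and never leave the camera.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

camera_key = Ed25519PrivateKey.generate()  # stand-in for the factory-provisioned key

def sign_frame(frame: bytes) -> bytes:
    """Return a detached signature stored alongside the photo/video frame."""
    return camera_key.sign(frame)

# Sign the pixels together with the GPS/timestamp metadata, so a signed
# frame can't be replayed with a forged location or time.
frame = b"...raw JPEG bytes + GPS + timestamp..."
signature = sign_frame(frame)
public_key = camera_key.public_key()  # the manufacturer would publish this
```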

If it gets bad enough you'll just carry around a film camera and snap a photo of an accident to prove your dashcam was "real."

Official news sources, such as the AP, aren't going to just start faking shit. So look towards media that has a reputation. Yellow journalism has always been around and always will be. But one advancement in faking whatever doesn't mean countermeasures stop advancing.

[-] atx_aquarian@lemmy.world 3 points 1 year ago

But we could put a private key on every camera and digitally sign every photo/video to prove where it came from.

Unfortunately, the movie Freeze Frame is becoming increasingly relevant.

I’m going to push back mainly on your last point: “news agencies aren’t going to just start faking shit”.

Why not? What happened to CNN? What’s to stop anyone from buying out a news agency and steering the ship elsewhere? Obviously they wouldn’t want to be obvious about it, but with a little change here and there, you’d be surprised what you can convince people of.

As a society we are incredibly susceptible to social influence and it’s only going to get easier. I can’t rely on the past to predict the future accurately when the medium is so easily changeable.

[-] Fraylor@lemm.ee 1 points 1 year ago

Thanks to deepfakes, basically none of it is trustworthy anymore.

[-] MonkderZweite@feddit.ch 36 points 1 year ago* (last edited 1 year ago)

This violates the right to your own image. You are not allowed to upload images of a classmate to an AI cloud service without asking, nor to pass the generated images around.

[-] MartianSands@sh.itjust.works 23 points 1 year ago

Is that an actual legal right? If you've described it accurately, then Facebook and Instagram would be completely illegal

[-] rentar42@kbin.social 12 points 1 year ago

It depends on your location, different countries have very different laws.

For example, in most countries it's perfectly acceptable to have someone in a picture that you're taking in public (for example, you're taking a picture of a building and someone happens to walk by). A notable exception to this is France, where the right to one's own image is apparently quite strong, which technically makes most pictures of the Eiffel Tower illegal (as long as any one person is identifiable in them).

Rules on taking (and distributing) a picture of one specific person who's just doing random stuff in public are less uniform and vary more. There's often some protection that basically says "no, you can't make fun of some random person for having the wrong t-shirt; they have a right to privacy". A notable exception to that is usually "public figures" (which mostly means people in political, religious or commercial leadership positions): they mostly just have to accept being pictured wherever.

Protection for pictures taken in a private setting is usually the strongest (so yes, if you post a picture of your 3 best friends at a small party in your home, you might have to ask them for permission!).

How all of this applies to pictures that "aren't real" but look disturbingly so is probably going to be fought over in court for a good while.

[-] MonkderZweite@feddit.ch 2 points 1 year ago* (last edited 1 year ago)

No, a human right. And yeah, they mostly are. But it's not Facebook doing the offending here, it's each of the teens, so nobody can really enforce it. Same as with phone numbers, except that those are actually protected by law in most countries.

[-] CrayonRosary@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

~~AI generates novel images, though. They are merely trained to produce your likeness. None of the pixels are from any source images.~~

In this case, I'm mistaken. They used a clothing remover app on normal photos and did not train an AI.

[-] stevedidwhat_infosec@infosec.pub 4 points 1 year ago* (last edited 1 year ago)

Jesus Christ, that’s not even close to AI. They literally stitched shit together like in Photoshop.

By the way, where do you see that clothoff (the app mentioned in the article) doesn’t use trained AI models? I’m refraining from visiting their site to check myself, as I don’t really want to give them that traffic, and I figured I’d ask you directly instead (as you already went there to verify, I assume).

[-] CrayonRosary@lemmy.world 1 points 1 year ago

I didn't say the app doesn't use trained models. I said the students didn't themselves train an embedding or LoRA against the other students' faces in order to generate entirely new pics.

[-] JackGreenEarth@lemm.ee 19 points 1 year ago

You can't stop them from being made; they're just the same deepfakes people were already making. It's important to note that they're not photos of people, they're guesses made by an algorithm.

[-] strider@feddit.nl 62 points 1 year ago

While you're completely right, that's hardly a consolation for those affected. The damage is done, even if it's not actually real, because it will be convincing enough for at least some.

[-] PunnyName@lemmy.world 5 points 1 year ago

While I understand your point, what consolation can be provided?

[-] ParsnipWitch@feddit.de 12 points 1 year ago

I think the people who made the pictures have to suffer consequences. Otherwise this sends the message that behaving that way is just fair game.

[-] n0m4n@lemmy.world 19 points 1 year ago* (last edited 1 year ago)

The faces are not generated, and that is where the damage comes from. It targets the girls for humiliation by implying that they allowed the nudes to be taken of them. Depending on the location and circumstances, this could get the girls murdered; think of "honor killings" by fundamentalists. It makes them targets for further sexual abuse, too. Anyone distributing the photos is at fault, as well as the people who made them.

The problem goes deeper, though. We can never trust a photo as proof of anything again. Let that sink in, and think about what it means for society.

[-] maegul@lemmy.ml 14 points 1 year ago

To push back on your attempt to minimise what's going on here ...

Yes, they're not actually photos of the girls. But then, a photo of a naked person isn't actually the same as that person standing in front of you naked, either.

If being seen naked is unwanted and embarrassing etc, why should a photo of you naked be embarrassing, and, to make my point, what difference would it make if the photo is more or less realistic? An actual photo can be processed or taken under certain lighting or with a certain lens or have been taken some time in the past ... all factors that lessen how close it is to the current naked appearance of the subject. How unrealistic can a photo be before it's no longer embarrassing?

Psychologically, I'd say it's pretty obvious that the embarrassment of a naked image is that someone else now has a relatively concrete image in their minds of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

There's probably some range of realism within which there's an embarrassing effect, and I'd bet AI is very capable of getting in that range pretty easily these days.

While the technology is out there now ... it doesn't mean that our behaviours with it are automatically acceptable. Society adapts to the uses and abuses new technology has and it seems pretty obvious that we're yet to culturally curb the abuses of this technology.

[-] Rayspekt@kbin.social 6 points 1 year ago

Exactly, the technology is out there and will not cease to exist. Maybe we'll digitally sign our photos in the future so that deepfakes can be sorted out that way.
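The verification half of that scheme would be cheap for anyone to run; a sketch, under the same hypothetical Ed25519 setup as the dashcam example upthread:

```python
# Verification sketch: anyone holding the camera's published public key
# can check that the bytes are untouched since capture.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def is_authentic(public_key: Ed25519PublicKey,
                 frame: bytes, signature: bytes) -> bool:
    try:
        public_key.verify(signature, frame)
        return True  # exactly the bytes the camera signed
    except InvalidSignature:
        # Any edit to the pixels, including an AI "undressing" pass,
        # breaks the signature.
        return False
```

Note that a deepfake could still be signed by *some* key, so this only proves which device a file came from, not that its content is true.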

[-] Redditiscancer789@lemmy.world 17 points 1 year ago

Omg it's NFTs' time to shine!!!!

/S

[-] drbluefall@toast.ooo 5 points 1 year ago

Will everyone be expected to have some kind of official PGP key?

[-] xc2215x@lemmy.world 12 points 1 year ago

That is very messed up.

Inevitable. Our technology outpaced our evolution a long time ago. We’re spiraling.

[-] rentar42@kbin.social 4 points 1 year ago

I think that happened at least 10k years ago ... it's just that the spiral is getting faster and faster ...

[-] PsychedSy@sh.itjust.works 1 points 1 year ago

That's always the case. Evolution, both biological and societal, happens after the environment changes.
