[-] ryathal@sh.itjust.works 47 points 2 years ago

If you replace all your online images then ai can't look at it. No one else can either, but you stop ai I guess.

[-] Quexotic@sh.itjust.works 52 points 2 years ago
[-] stoy@lemmy.zip 23 points 2 years ago

Isn't Nightshade defeated by just applying an anti-aliasing filter to the image?
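
Something like this cheap Pillow pass (a rough sketch, untested against Nightshade itself; filenames are placeholders) would be trivial to run over a whole scraped dataset:

```python
# Blur plus a downscale/upscale round trip, both of which tend to destroy
# high-frequency adversarial noise. Whether this fully strips Nightshade's
# perturbation is an open question; the point is how trivial it is to apply.
from PIL import Image, ImageFilter

def smooth(path_in: str, path_out: str) -> None:
    img = Image.open(path_in).convert("RGB")
    img = img.filter(ImageFilter.GaussianBlur(radius=1))
    w, h = img.size
    img = img.resize((w // 2, h // 2), Image.LANCZOS).resize((w, h), Image.LANCZOS)
    img.save(path_out, quality=90)  # JPEG re-encoding adds its own smoothing

smooth("poisoned.png", "cleaned.jpg")  # placeholder filenames
```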

[-] littlebluespark@lemmy.world 14 points 2 years ago

Yeah, this is some "I don't consent" FB post level of stupid shit. 🤷🏼‍♂️

[-] ryathal@sh.itjust.works 9 points 2 years ago

Even if it's not, it's using a pattern to "defeat" something that is mostly pattern recognition.

[-] Quexotic@sh.itjust.works 2 points 2 years ago

Makes sense. There will definitely be an adversarial situation going forward.

[-] wildginger@lemmy.myserv.one 1 points 2 years ago

Ever heard of the red queen hypothesis?

[-] stoy@lemmy.zip 1 points 2 years ago
[-] wildginger@lemmy.myserv.one 7 points 2 years ago

"it takes all the running in the world just to stay in place."

Normally refers to biology arms races, where a poisonous animal and a poison resistant predator play tit for tat, making stronger poisons and stronger resistances to try and outplay the other just to stay alive.

Now, artists and AI are doing the same. AI wants to steal art without paying for it, artists don't want their art stolen. Artists come up with little tricks to poison the data set if their art is used, AI comes up with little tricks to strip the poison from the data.

The dance continues, the dancers straining and struggling, all to stand still.

[-] stoy@lemmy.zip 2 points 2 years ago

Ah, that's similar to a struggle my dad described when negotiating sales of complex systems: both parties start with unrealistic demands, just to have stuff to give away to the other side during negotiations.

He has told me several times that he just wishes that the process was way more streamlined and that the parties could start closer to the realistic goal.

He has since retired, so he no longer needs to deal with it...

[-] noodlejetski@lemm.ee 2 points 2 years ago

I hope content hosting services start applying it by default.

[-] TheOneCurly@lemmy.theonecurly.page 20 points 2 years ago

I believe this is suggesting an AI poisoning edit, not removing the image entirely. It should be mostly imperceptible. Plus, you could update with newer methods as they come out.

[-] Justas@sh.itjust.works 4 points 2 years ago

You could generate a different temporary img url every time and nightshade it after the link expires.
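
Roughly like this, with a generic HMAC-signed expiring link (a hypothetical scheme, not any particular host's API):

```python
# Sketch: the signed URL only validates until `expires`; after that the
# server can quietly swap in a Nightshaded copy behind the same image id.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # placeholder key

def sign_url(image_id: str, ttl: int = 3600) -> str:
    expires = int(time.time()) + ttl
    msg = f"{image_id}:{expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"/img/{image_id}?expires={expires}&sig={sig}"

def is_valid(image_id: str, expires: int, sig: str) -> bool:
    msg = f"{image_id}:{expires}".encode()
    good = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(good, sig) and time.time() < expires
```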

[-] Deceptichum@kbin.social 32 points 2 years ago* (last edited 2 years ago)

I’m only vaguely familiar with ML datasets and have only trained on local data, but I’ve never heard of this? Can anyone provide some evidence this is the case?

Edit: Looking further I can still only find datasets containing the image files, ex.

https://www.lvisdataset.org/dataset

https://www.v7labs.com/open-datasets

[-] Ymmelbackwards@lemmy.world 17 points 2 years ago

LAION is one of the big dogs (https://laion.ai/). Their datasets consist of urls and metadata.
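
For example, one of their parquet shards is just rows of links and captions (column names here follow the published LAION-400M schema; check whichever shard you actually download):

```python
# LAION distributes parquet files of URLs + metadata, not the pixels.
import pandas as pd  # plus pyarrow for parquet support

df = pd.read_parquet("laion_subset.parquet")  # placeholder filename
print(df[["URL", "TEXT"]].head())
# Pipelines fetch these URLs at download time, which is why an image that
# gets swapped out (or poisoned) after indexing still lands in the set.
```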

[-] Deceptichum@kbin.social 3 points 2 years ago

Ah perfect, thank you so much!

https://github.com/rom1504/img2dataset

Seems to be the main tool, I’ll have something new to explore this weekend.
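
The Python API looks to boil down to something like this (argument names as I remember them from the README; double-check before running):

```python
# Downloads the images referenced by a LAION-style parquet of URLs.
from img2dataset import download

download(
    url_list="laion_subset.parquet",  # placeholder path
    input_format="parquet",
    url_col="URL",
    caption_col="TEXT",
    image_size=256,
    output_folder="images",
    processes_count=4,
)
```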

[-] MargotRobbie@lemm.ee 13 points 2 years ago

LAION-5B is so notoriously badly labeled that a few poisoned samples, even if the poisoning worked as advertised, would literally not matter at all.

Plus, it's not doing anything to existing diffusion models that used LAION-5B, since many artists are under the mistaken impression that the models constantly scrape the Internet for new images and train on them automatically, when training a model to learn new information without catastrophic forgetting is almost impossible (hence workarounds like LoRAs and such).
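
The LoRA idea, very roughly (a toy PyTorch sketch, not any production implementation): freeze the pretrained weights and learn only a small low-rank update on top, so new concepts come in without overwriting what the model already knows:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        # Low-rank factors: B starts at zero, so training begins at the
        # unmodified pretrained behavior.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale

layer = LoRALinear(nn.Linear(768, 768))  # e.g. one attention projection
out = layer(torch.randn(1, 768))
```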

Again, a reminder that the creator of Nightshade and Glaze, Ben Zhao of UChicago, is literally a code thief who stole GPL code for his closed-source product (warning: reddit link) to scam artists who don't understand the tech behind ML models.

[-] Daxtron2@startrek.website 0 points 2 years ago* (last edited 2 years ago)

Tell me you have no idea how it works without telling me.

Edit: sorry that's unclear, I mean the OP on mastodon not you specifically

[-] thorbot@lemmy.world 2 points 2 years ago

"tell me without telling me" needs to fucking die

[-] Daxtron2@startrek.website 1 points 2 years ago

tell me you care too much without telling me

[-] AlexWIWA@lemmy.ml 10 points 2 years ago
[-] Thann@lemmy.ml 2 points 2 years ago

This is the preferred method

[-] RVGamer06@sh.itjust.works 2 points 2 years ago

c/foundsatan

[-] Hotzilla@sopuli.xyz 1 points 2 years ago

Fans of the actual content might have some issue with it

[-] GardenVarietyAnxiety@lemmy.world 8 points 2 years ago

TIL about Glazing and Nightshade. Thanks!

[-] Leate_Wonceslace@lemmy.dbzer0.com 4 points 2 years ago

They don't work as advertised.

[-] Mango@lemmy.world 2 points 2 years ago

I'm still confused.

[-] 31337@sh.itjust.works 1 points 2 years ago* (last edited 2 years ago)

Hmm, looks like it would also mess up classification, recommendation, captioning, etc. models that use these images. Maybe image and duplicate search as well? Maybe it could be used to get around automated copyright strikes?
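
Duplicate search in particular often relies on perceptual hashes, which could plausibly shift (a sketch using the imagehash library; the actual impact on any given system is untested):

```python
# Perceptual hashes are built from coarse image structure, so a perturbation
# strong enough to fool a model may also move the hash past the usual
# "same image" threshold.
import imagehash
from PIL import Image

h_orig = imagehash.phash(Image.open("original.png"))   # placeholder files
h_pois = imagehash.phash(Image.open("poisoned.png"))
print(h_orig - h_pois)  # Hamming distance; small means "duplicate"
```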
