WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’
(www.theguardian.com)
Systemic prejudices showing up in datasets, causing generative systems to spew biased output? Gasp... say it isn't so.
I'm not sure why this is surprising anymore. This is literally expected behavior unless we get a grip on these systemic problems. Everything else is just patchwork and bandages.
I'd like to point out that not everything generative is machine learning, so prejudices in datasets don't affect every generative system.
That's off topic, but I'm playing with exactly such a thing right now: generative music. I started with SuperCollider, but it was too hard (maybe not anymore, to be fair; recycling a phrase, for example, would probably be much easier and faster there than in my macaroni shell script), so now I just generate ABC notation, convert it to MIDI with various instruments, and play it through FluidSynth.
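For what it's worth, a minimal sketch of that kind of pipeline: the script below generates a random eight-note ABC phrase in C major, and the commented lines show the assumed downstream steps (`abc2midi` from the abcmidi package and `fluidsynth` with a SoundFont; tool names and the SoundFont filename are illustrative, not from the original comment).

```shell
#!/bin/sh
# Generate a random 8-note ABC tune on the C major scale.
scale="C D E F G A B c"
body=""
i=0
while [ "$i" -lt 8 ]; do
  # Pick a pseudo-random note index 1..8 from /dev/urandom.
  n=$(( ( $(od -An -N1 -tu1 /dev/urandom) % 8 ) + 1 ))
  body="$body$(echo "$scale" | cut -d' ' -f"$n")"
  i=$((i + 1))
done
tune="X:1
T:Random phrase
M:4/4
K:C
$body|]"
echo "$tune"
# Hypothetical downstream steps (not invoked here):
#   echo "$tune" > tune.abc
#   abc2midi tune.abc -o tune.mid          # ABC -> MIDI
#   fluidsynth -ni GeneralUser.sf2 tune.mid  # render with a SoundFont
```

The interesting part is that the "composition" logic stays in plain text (ABC), so swapping instruments or phrase structure is just string manipulation before the conversion step.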