this post was submitted on 10 May 2024
296 points (99.3% liked)
Not The Onion
When a human in a plane drops a bomb on a school full of kids, we don't charge anyone with a war crime. Why would we start charging people with war crimes when we make the plane pilotless?
The autonomy of these killer toys is always overstated. As front-line trigger pullers, they're great. But they still need an enormous support staff, deployment team, and IT operation. If you want to blame someone for releasing a killer robot into a crowd of civilians, it's not like you have a shortage of people to indict. No different from trying to figure out who takes the blame for throwing a grenade into a movie theater. Everyone from the mission commander down to the guy who drops a Kill marker on the digital map has the potential for indictment.
But nobody is going to be indicted in a mission where the goal was to blow up a school full of children, because why would you do that? The whole point was to murder those kids.
Israelis already have an AI-powered target-to-kill system, after all.
Literally the entire point of this system is to kill whole families.