[-] context@hexbear.net 58 points 1 month ago

this is from a couple years ago. everyone is here thinking this is like minority report but it's not about individual behavior, this is just about using predictive models to self-justify racist policing patterns:

Fears of “The Thought Police” are probably running through your head right now, but Chattopadhyay is keen to stress that the focus isn’t on individuals, “We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator,” she says.

https://archive.ph/zgUjs

[-] MolotovHalfEmpty@hexbear.net 45 points 1 month ago

Which of course has a certain degree of success baked in, because if you focus policing in a particular place you will find crimes there because a) crimes happen everywhere and b) cops can just juke the stats / make shit up / make arrests without cause.

[-] context@hexbear.net 34 points 1 month ago

exactly. it's amazing to me that these nerds can talk themselves into creating an ouroboros like this because they don't actually bother to understand how any of this shit works, but i guess whatever justifies their salary...

[-] Infamousblt@hexbear.net 5 points 1 month ago* (last edited 1 month ago)

It's the result of other scientists pretending sociology isn't a science. Sociology shows why shit like this is worthless, so instead of just working with sociologists, they ignore them.

[-] CDommunist@hexbear.net 34 points 1 month ago

very-smart "using advanced AI, we have determined there will be more crime in the high crime area"

[-] context@hexbear.net 23 points 1 month ago* (last edited 1 month ago)

it's even worse than that! they're treating crimes like they're forces of nature or fucking dice rolls to begin with and completely ignore the role police play in defining and creating crime and the construction of criminality!

i mean garbage in, garbage out, and the whole edifice is built upon a giant pile of racist garbage and these assholes will happily congratulate themselves about how good at math they are

[-] TheLastHero@hexbear.net 6 points 1 month ago

AI bringing back miasma theory, past crimes are creating bad odors in the area that are just turning previously pure citizens into criminals. I hope the government gives the police more military equipment to purge these evil vapors

[-] context@hexbear.net 5 points 1 month ago

exactly, but we call it "broken windows theory" to jazz it up a bit

[-] fox@hexbear.net 42 points 1 month ago

With 90% accuracy it will correctly identify 90 out of 100 criminals and falsely accuse 100 out of 1000 innocent people.
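The arithmetic behind that comment can be sketched in a few lines of Rust. This assumes the simplest reading: a symmetric error rate, so a 90%-accurate classifier misses 10% of actual criminals and flags 10% of innocents. The function name and the population sizes are just illustrative.

```rust
// Toy base-rate arithmetic: with symmetric 90% accuracy, 10% of a much
// larger innocent population still produces more false accusations per
// capita of scrutiny than the handful of true positives might suggest.
fn confusion_counts(accuracy: f64, criminals: u32, innocents: u32) -> (u32, u32) {
    let true_positives = (accuracy * criminals as f64).round() as u32;
    let false_positives = ((1.0 - accuracy) * innocents as f64).round() as u32;
    (true_positives, false_positives)
}

fn main() {
    let (tp, fp) = confusion_counts(0.90, 100, 1000);
    // prints: correctly flagged: 90, falsely accused: 100
    println!("correctly flagged: {tp}, falsely accused: {fp}");
}
```

Because innocents vastly outnumber criminals, even a small error rate generates false accusations on the same scale as the true positives.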

[-] AntiOutsideAktion@hexbear.net 22 points 1 month ago

It's better for ten innocent people to be jailed than for one full-time wage to be paid

[-] underwire212@lemm.ee 4 points 1 month ago

10 innocent people may be jailed, but it’s a risk I’m willing to take

[-] Wolfman86@hexbear.net 5 points 1 month ago

I can see anywhere prisons are run privately being massively in favour of this.

pub fn predict_crime(suspect: Person) -> bool {
   if suspect.race() == Race::Black {
       return true;
   } else {
       return false;
   }
}
[-] huf@hexbear.net 24 points 1 month ago

ew...

pub fn predict_crime(suspect: Person) -> bool {
   return suspect.race() == Race::Black
}
[-] TheDoctor@hexbear.net 16 points 1 month ago

Good change but also why is race a getter method while Race::Black is a constant enum? Is race an impure function dependent on global state? Is it derived from some other internal immutable state?

race() is a getter method because it depends on which Eastern and Southern Europeans are considered white at the time

[-] ProletarianDictator@hexbear.net 2 points 1 month ago

you don't need the return statement either

[-] huf@hexbear.net 2 points 1 month ago

i don't even know what language this is :D i just thought it'd be a nice bit to silently pass over the racism aspect and nitpick the code

[-] ProletarianDictator@hexbear.net 2 points 1 month ago

It's Rust.

If the last line of a function body is an expression with no trailing semicolon, that expression's value becomes the return value, so suspect.race() == Race::Black evaluates to the bool the function returns, no return keyword needed.
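A minimal, less charged illustration of the same rule (function name invented for the example):

```rust
// Rust's implicit return: the final expression of a block, written without
// a trailing semicolon, is the value of the block.
fn is_even(n: i32) -> bool {
    n % 2 == 0 // no `return`, no semicolon: this is the return value
}

fn main() {
    println!("{}", is_even(4)); // prints "true"
}
```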

[-] Barx@hexbear.net 18 points 1 month ago

Nerds with a rudimentary understanding of undergrad stats do this all the time with extra steps by just building a simplistic model based on (racist) "crime data". Sometimes literally just a basic Bayesian model.

And they get hired by Palantir to do versions of that for $300k/year.
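The kind of model being described is roughly this: "predicted risk" is just the historical arrest rate per area, so whatever bias is in the policing data comes straight back out as "prediction". A hedged sketch, with all area names and counts invented:

```rust
use std::collections::HashMap;

// Toy "predictive" model: risk score per area is nothing more than each
// area's share of historical arrests. Biased input data reproduces itself
// as output. Assumes a non-empty history with a nonzero total.
fn risk_scores(arrests: &[(&str, u32)]) -> HashMap<String, f64> {
    let total: u32 = arrests.iter().map(|(_, n)| *n).sum();
    arrests
        .iter()
        .map(|(area, n)| (area.to_string(), *n as f64 / total as f64))
        .collect()
}

fn main() {
    // Areas that were policed harder in the past "predictably" score higher.
    let history = [("heavily_policed", 90), ("lightly_policed", 10)];
    for (area, score) in risk_scores(&history) {
        println!("{area}: {score:.2}");
    }
}
```

Nothing here measures crime; it measures where arrests were made, which is exactly the feedback loop the thread describes.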

[-] Lussy@hexbear.net 30 points 1 month ago* (last edited 1 month ago)

us-foreign-policy

Wow how innovative

[-] Feline@hexbear.net 30 points 1 month ago

U Chicago continuing its proud reactionary legacy https://en.wikipedia.org/wiki/Chicago_Boys

[-] JoeByeThen@hexbear.net 25 points 1 month ago

Insider trading is probably very predictable, with enough data.

[-] SorosFootSoldier@hexbear.net 24 points 1 month ago

The Torment Nexus is only when you do 1984, not Minority Report.

[-] glans@hexbear.net 21 points 1 month ago

It predicts police behaviour, not crime. And who can't do that?

[-] DragonBallZinn@hexbear.net 20 points 1 month ago

Can’t wait until in “freedomland” I get arrested not because I committed any crime, but because I look like someone who might.

“Red always sus” but in real life.

[-] GalaxyBrain@hexbear.net 12 points 1 month ago

That's been happening for a really really long time already. It's called racism, now they're teaching it to computers.

[-] Wolfman86@hexbear.net 5 points 1 month ago

Didn’t it come out early on that AI was racist?

It didn't really “come out”. Garbage in, garbage out has always been understood: a model reflects the biases of its training data by design, so no serious researcher was surprised.

[-] Evilphd666@hexbear.net 14 points 1 month ago

Does it poop out cute little billiard balls too?

[-] TheDoctor@hexbear.net 12 points 1 month ago

We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator.

We found that when stressed, the law enforcement response is seemingly different in high socio-economic-status (SES) areas compared to their more disadvantaged neighboring communities. It is suggested in the paper, that when crime rates spike, the higher SES neighborhoods tend to get more attention at the cost of resources drawn away from poorer neighborhoods.

link

[-] Philosoraptor@hexbear.net 17 points 1 month ago

We found that when stressed, the law enforcement response is seemingly different in high socio-economic-status (SES) areas compared to their more disadvantaged neighboring communities. shocked-pikachu

this post was submitted on 03 Dec 2024
142 points (100.0% liked)

chapotraphouse
