submitted 21 Aug 2023 by L4s@lemmy.world to c/technology@lemmy.world · 385 points (94.3% liked)

Police in England installed an AI camera system along a major road. It caught almost 300 drivers in its first 3 days.::An AI camera system installed along a major road in England caught 300 offenses in its first 3 days. There were 180 seat belt offenses and 117 mobile phone offenses.

[-] EndlessApollo@lemmy.world 48 points 1 year ago

ITT: a bunch of people who have never read an ounce of sci-fi (or got entirely the wrong message and think law being enforced by robots is a good thing)

[-] CrayonRosary@lemmy.world 18 points 1 year ago

Calling an image recognition system a robot enforcing the law is such a stretch you're going to pull a muscle.

[-] EndlessApollo@lemmy.world -2 points 1 year ago

It's going to disproportionately target minorities. ML* isn't some wonderful impartial observer; it's subject to all the same biases as the people who made it. Whether the people at the end of the process are impartial or not barely matters either, imo; they're going to get the biased results of the ML looking for criminals, so it's still going to be a flawed system even if the human element is OK. Ffs, please don't support this kind of dystopian shit. Idk how it's not completely obvious how horrifying this stuff is.

*What people call AI is not intelligent at all. It uses machine learning, the same process as chatbots and autocorrect. AI is a buzzword used by tech bros who are desperate to "invest in the future."

[-] echodot@feddit.uk 18 points 1 year ago* (last edited 1 year ago)

But the law isn't enforced by robots; the law is enforced by humans. All that's happening here is that the process of capturing transgressions has been automated. I don't see how that's a problem.

As long as humans are still part of the sentencing process, and they are, then functionally there's no difference; if a mistake is made, it will be rectified at that point. From a process point of view there isn't really any difference between being caught by an automated AI camera and being caught by a traffic cop.

[-] davidalso@lemmy.world 6 points 1 year ago

Although completely reasonable, I fear that your conclusion is inaccessible for most folks.

And as a pedestrian, I'm all for a system that's capable of reducing distracted driving.

[-] afraid_of_zombies@lemmy.world 4 points 1 year ago

You have never had to dispute one of those tickets, I assume.

Almost a decade ago I got one in the mail from a city that is about 9 hours away from my house. I went through the dispute process and was told repeatedly, "I am tired of people claiming that it wasn't them," while I suggested that if their system worked they would most likely get fewer calls. It was pure luck that I noticed the date was the exact date my daughter was born, so the only way I could have been in that city is if I had somehow left my wife while she was in labor and managed to move my car 9 hours away. Once I pointed that out, and that I could send them the birth certificate, they gave up.

The problem with these systems is that they are trusted 100%, and it falls on the regular person to prove their innocence, which is the exact opposite of what the relationship should be. If I am issued a ticket, it should be on the state to produce the evidence, not on me to get lucky.

[-] echodot@feddit.uk 1 points 1 year ago* (last edited 1 year ago)

If you read the article, it makes clear that it wouldn't get that far.

It goes to a human operator, who looks at the picture and says whether or not they can actually see a violation in the image. So it wouldn't get as far as an official sanction, and you wouldn't have to go through that process.

[-] afraid_of_zombies@lemmy.world 3 points 1 year ago

I'm sorry, am I talking to myself? I just gave you a literal example of this not working.

[-] EndlessApollo@lemmy.world 2 points 1 year ago

It's going to disproportionately target minorities. ML* isn't some wonderful impartial observer; it's subject to all the same biases as the people who made it. Whether the people at the end of the process are impartial or not barely matters either, imo; they're going to get the biased results of the ML looking for criminals, so it's still going to be a flawed system even if the human element is OK. Ffs, please don't support this kind of dystopian shit. Idk how it's not completely obvious how horrifying this stuff is.

*What people call AI is not intelligent at all. It uses machine learning, the same process as chatbots and autocorrect. AI is a buzzword used by tech bros who are desperate to "invest in the future."

[-] atzanteol@sh.itjust.works 12 points 1 year ago

According to sci-fi, organ transplants will lead to the creation of monsters who will kill us all for "tampering in God's domain."

Maybe fiction isn't the best way to determine policy...

[-] afraid_of_zombies@lemmy.world 2 points 1 year ago

Worked for Gattaca. GINA pretty much only exists because of it.

[-] RoyalEngineering@lemmy.world -3 points 1 year ago

Yeah!! Now they got them new fangled pooters in cars too!! No thank U Night Rider!!
