The funniest thing to me is that this probably isn't even the fault of the AI; it's probably the fault of software developers too lazy to write even semi-decent code that would do a good job of (not) being a nuisance.
Most developers take pride in what they do and would love to build in all the best features for launch.
But that's not possible. There's a deadline and a finite budget for programmers. Ipso facto, a finite number of dev hours.
Where can I buy a traffic cone shaped rock?
I read RoboTaxis as RobotAxis and wondered what a mechanical version of the losers of WW2 had to do with cars
Sounds like a good SciFi
This is STUPID! I can't WAIT for President MUSK to ELIMINATE all these Pesky Rules preventing AI Cars from MOWING DOWN CHILDREN In Crosswalks!
I work in a field related to this, so I can try to guess at what's happening behind the scenes. Initially, most companies had very complicated non-machine-learning algorithms (rule-based/hand-engineered) that solved the motion planning problem, i.e., how a car should move given its surroundings and its goal. This essentially means writing what is comparable to either a bunch of if-else statements, or a sort of weighted graph search (there are other ways, of course). This works well for, say, 95% of cases, but becomes exponentially harder to make work for the remaining 5% of cases (think drunk driver or similar rare or unusual events).
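To make the "bunch of if-else statements" idea concrete, here is a toy sketch of a rule-based planner. All function names, thresholds, and rules are illustrative inventions, not from any real system:

```python
# Toy sketch of a rule-based motion planner: a priority-ordered chain of
# hand-written checks. Every name and threshold here is made up for
# illustration.

def plan_speed(dist_to_crosswalk_m, pedestrian_waiting, lead_car_gap_m, speed_limit_mps):
    """Return a target speed (m/s) from simple hand-engineered rules."""
    # Rule 1: yield at a crosswalk when a pedestrian is waiting nearby.
    if pedestrian_waiting and dist_to_crosswalk_m < 20.0:
        return 0.0
    # Rule 2: slow down when following a lead vehicle too closely.
    if lead_car_gap_m < 10.0:
        return min(speed_limit_mps, lead_car_gap_m * 0.5)
    # Default: drive at the speed limit.
    return speed_limit_mps

print(plan_speed(15.0, True, 50.0, 13.0))   # pedestrian waiting -> 0.0
print(plan_speed(100.0, False, 8.0, 13.0))  # tailgating case -> 4.0
```

The appeal is that each rule is auditable and always fires; the pain is that every rare situation needs yet another hand-written branch, which is the exponential blow-up described above.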
Solving the final 5% was where most turned to machine learning - they were already collecting driving data for training their perception and prediction models, so it's not difficult at all to just repurpose that data for motion planning.
So when you look at the two kinds of approaches, they have quite distinct advantages over each other. Hand-engineered algorithms are very good at obeying rules - if you tell one to wait at a crosswalk or obey precedence at a stop sign, it will do that no matter what. They are not, however, great in situations with higher uncertainty/ambiguity. For example, a pedestrian starts crossing the road outside a crosswalk and waits at the median to allow you to pass before continuing on - it's quite difficult to come up with a one-size-fits-all rule to cover these kinds of situations. Driving is a highly interactive behaviour (lane changes, yielding to pedestrians, etc.), and rule-based methods don't do so well here because there is little structure to the problem. Some machine-learning-based methods, on the other hand, are quite good at handling these kinds of uncertain situations, and Waymo has invested heavily in building these up. I'm guessing they're trained with a mixture of human data + self-play (imitation learning and reinforcement learning), so they may learn some odd/undesirable behaviours. The problem with machine learning models is that they are ultimately a strong heuristic that cannot be trusted to produce a 100% correct answer.
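The imitation-learning half of that mixture can be sketched in a few lines: fit a policy to logged human driving by minimizing the error between what the model proposes and what the human actually did. The data, the linear model, and the single "gap to lead car" feature below are stand-ins I made up for illustration; real systems use large neural networks over rich scene representations:

```python
# Minimal behavioral-cloning sketch: learn speed = w * gap + b from a
# pretend log of (gap to lead car in m, human-chosen speed in m/s) pairs,
# via stochastic gradient descent on squared error.

data = [(5.0, 2.0), (10.0, 5.0), (20.0, 10.0), (40.0, 13.0)]

w, b = 0.0, 0.0     # parameters of the toy linear "policy"
lr = 0.0005         # learning rate

for _ in range(5000):
    for x, y in data:
        err = (w * x + b) - y   # prediction error vs. the human action
        w -= lr * err * x       # gradient step on squared error
        b -= lr * err

print(w * 15.0 + b)  # the learned policy's speed choice at a 15 m gap
```

Note what this does and doesn't guarantee: the policy interpolates smoothly between situations it has seen (good for ambiguous scenes), but nothing in the loss function says "never enter an occupied crosswalk" - it only says "be close to the humans in the log," which is exactly the "strong heuristic" caveat above.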
I'm guessing that the way Waymo trains its motion planning model, or some bias in the data, allows it to find some sort of exploit that makes it drive through crosswalks. Usually this kind of thing is solved by creating a hybrid system - a machine learning system underneath, with a rule-based system on top as a guard rail.
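That hybrid pattern is simple to sketch: the learned planner proposes an action, and a thin rule-based layer vetoes anything that violates a hard constraint. Everything below (the fake "model", the scene dictionary, the thresholds) is hypothetical:

```python
# Sketch of an ML planner wrapped by a rule-based guard rail. The
# stand-in ml_planner represents a learned model that sometimes
# proposes rule-breaking actions.

def ml_planner(scene):
    # Pretend this speed came from a trained network.
    return {"target_speed": 8.0}

def guard_rail(scene, proposal):
    """Override the learned proposal when it would break a hard rule."""
    if scene["pedestrian_in_crosswalk"] and scene["dist_to_crosswalk_m"] < 20.0:
        return {"target_speed": 0.0}  # hard stop, regardless of the model
    return proposal

scene = {"pedestrian_in_crosswalk": True, "dist_to_crosswalk_m": 12.0}
print(guard_rail(scene, ml_planner(scene)))  # -> {'target_speed': 0.0}
```

The design point: the guard rail only needs to encode the small set of non-negotiable rules (crosswalks, stop signs, red lights), so it stays simple and auditable, while the learned model handles the messy interactive 5%.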
(Apologies for the very long comment, probably the longest one I've ever left)
Anything to keep the car companies stock price up
It is an offense in Japan to not stop if someone is waiting before entering the crosswalk (and technically to progress until they are fully off the entire street, though I've had assholes whip around me for not breaking the law). People do get ticketed for it (though not enough, honestly). I wonder what they would do here.
Is this site being weird or am I tripping, because I just came in here, and there was an on point comment about being a parent and wiping pee being a part of life, and ended with a solid joke, but, I come back in here and it’s gone with no deleted or anything. It was good and on point enough that I returned to reply…. What is happening? I’m too tired for this confusion!
Are the developers Swedish?
Here we go again, blaming robots for doing the same thing humans do. Only at least the robots don't flip you off when they try to run you over.
Humans aren't programmed to break the law.
Fuck Cars
A place to discuss problems of car centric infrastructure or how it hurts us all. Let's explore the bad world of Cars!
Rules
1. Be Civil
You may not agree on ideas, but please do not be needlessly rude or insulting to other people in this community.
2. No hate speech
Don't discriminate or disparage people on the basis of sex, gender, race, ethnicity, nationality, religion, or sexuality.
3. Don't harass people
Don't follow people you disagree with into multiple threads or into PMs to insult, disparage, or otherwise attack them. And certainly don't doxx any non-public figures.
4. Stay on topic
This community is about cars, their externalities in society, car-dependency, and solutions to these.
5. No reposts
Do not repost content that has already been posted in this community.
Moderator discretion will be used to judge reports with regard to the above rules.
Posting Guidelines
Since Lemmy doesn't have a flair system yet, let's make it easier to scan through posts by tagging them by type:
- [meta] for discussions/suggestions about this community itself
- [article] for news articles
- [blog] for any blog-style content
- [video] for video resources
- [academic] for academic studies and sources
- [discussion] for text post questions, rants, and/or discussions
- [meme] for memes
- [image] for any non-meme images
- [misc] for anything that doesn’t fall cleanly into any of the other categories