If you want to motivate people to action, frame it in terms of the property damage they’ll experience to their car when it hits a child. We’ve already seen how far the American public is willing to go for children’s lives, and it’s not very far at all.
AI: 1% chance it's a human, keep going like nothing happened
Full speed in the dark, I think most people would have failed to avoid that. What's concerning is that it didn't stop afterwards.
Note that part of the discussion is that we shouldn't settle for human limitations when we don't have to. Notably, things like LIDAR are considered to give these systems superhuman vision. Tesla, however, said "eyes are good enough for folks, so just cameras."
The rest of the industry said LIDAR is important and focus on trying to make it more practical.
Hell, even without lidar, the thing was pretty clearly a large road obstacle a second and a half out. They had a whole left lane open, and at least enough time to make a significant speed reduction.
Volvo is using LIDAR. I trust them way more than Tesla when it comes to something pertaining to safety.
Isn't Elon advertising AI as having orders of magnitude better reaction time and being much less error prone than a human, though?
Remember when they removed ultrasonic and radar sensors in favor of "Tesla Vision"? That decision demonstrably cost people their lives and yet older, proven tech continues to be eschewed in favor of the cutting edge new shiny.
I'm all for pushing the envelope when it comes to advancements in technology and AI in its many forms, but those of us that don't buy Teslas never signed up to volunteer our lives as training data for FSD.
Didn't stop afterwards, didn't even attempt to brake
I think the LIDAR and other sensors are supposed to use IR and see in the dark.
Sensors that the Tesla famously doesn't have (afaik, didn't check) because Elon is a dumbass.
The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
How are these people always such pathetic suckers?
I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.
Fences alongside the road and special animal crossings are infeasible given the length of US roads, yes?...
I've read that they do that ... somewhere.
Same in Kansas. I was in a car that hit one in the 80s, and I see them often enough that I had to avoid one that was crossing a busy interstate highway last week.
Deer are the opposite of an edge case in the majority of the US.
Deer aren’t edge cases. If you are in a rural community or the suburbs, deer are a daily way of life.
As more and more of their forests are destroyed, deer are a daily part of city life too. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together, and I see deer in my lawn regularly.
- Vehicle needed lidar
- Vehicle should have a collision detection indicator for anomalous collisions and random mechanical problems
I notice nobody has commented on the fact that the driver should've reacted to the deer. It's not Tesla's responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle's movements at the end of the day.
Then it's not "Full self driving". It's at best lane assistance, but I wouldn't trust that either.
Elon needs to shut the fuck up about self driving and maybe issue a full recall, because he's going to get people killed.
True but if Tesla keeps acting like they're on the verge of an unsupervised, steering wheel-free system...this is more evidence that they're not. I doubt we'll see a cybercab with no controls for the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.
Only keeping the regular cameras was a genius move to hold back their full autonomy plans
Friendly reminder that Tesla Autopilot is AI trained on live data. If it hasn't seen something enough times, it won't know to stop. This is how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
So, a kid on a bicycle or scooter is an edge case? Fuck the Muskrat and strip him of US citizenship for illegally working in the USA. Another question. WTF was the driver doing?
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn't whether Tesla is better or worse in any one situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.
Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have data on this, but they are not talking about it.
Yeah there are edge cases in all directions.
When people want to say that something is very rare they should say "corner case," but this doesn't seem to have made it out of QA lingo and into the popular lexicon.