Womp womp (lemmy.world)
submitted 4 months ago by PugJesus@lemmy.world to c/memes@lemmy.world
[-] gigachad@sh.itjust.works 3 points 4 months ago

It's weird, an optical sensor should fall for this, but LiDAR detects objects in 3D.

[-] FauxLiving@lemmy.world 13 points 4 months ago

Teslas famously don't use lidar because Musk declared that cameras were good enough. Reality disagrees, but reality owns no shares of Tesla.

[-] Cort@lemmy.world 5 points 4 months ago

Musk declared that cameras were good enough

And then disabled existing lidar sensors in Teslas, so his team could just focus on camera vision only

[-] AtariDump@lemmy.world 3 points 4 months ago

He’s a dumbass

[-] RisingSwell@lemmy.dbzer0.com 2 points 4 months ago

I believe he disabled radar; I'm not sure Tesla ever had lidar. Radar would solve this problem anyway.

[-] MountingSuspicion@reddthat.com 4 points 4 months ago

Yea, to my understanding Tesla does not use LiDAR and that's the issue. They just use cameras I think. Other people posted videos in this thread: https://m.youtube.com/watch?v=IQJL3htsDyQ

I can't be bothered to care/watch anything Tesla related, but apparently the lack of lidar is the thing being tested.

[-] Ledericas@lemm.ee 4 points 4 months ago

Elon Musk didn't want lidar because it cost too much. Any Tesla before 2018 had it, but an update was introduced that bricked all of them.

[-] BossDj@lemm.ee 4 points 4 months ago

The first ten minutes are him "sneaking" a small lidar unit into Disneyland and using it to make 3D models of the ride path. That's pretty fun.

[-] MountingSuspicion@reddthat.com 4 points 4 months ago

Thanks for letting me know. I watched that and ended up watching the rest of the video. It was actually a pretty fun watch.

A decent camera-only vision system should still be able to detect the wall. I was actually shocked that Tesla failed this test so egregiously.

If you use two side by side cameras you can determine distance to a feature by calculating the offset in position of the feature between the two camera images. I had always assumed this was how Tesla planned to achieve camera only FSD, but that would make too much sense.

https://www.intelrealsense.com/stereo-depth-vision-basics/
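The math behind the stereo approach is simple: once you know how far a feature shifts between the two images (the disparity), depth falls out of one division. A minimal sketch, using made-up camera parameters (700 px focal length, 12 cm baseline) purely for illustration:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a feature seen by both cameras: Z = f * B / d.

    focal_px     - focal length in pixels
    baseline_m   - distance between the two cameras, in meters
    disparity_px - horizontal shift of the feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (feature must shift between views)")
    return focal_px * baseline_m / disparity_px

# Example: a feature shifted 14 px between the left and right images
# is 700 * 0.12 / 14 = 6 meters away.
print(depth_from_disparity(700, 0.12, 14))  # 6.0
```

Note the failure mode: as disparity shrinks (distant objects), small pixel errors cause large depth errors, which is one reason stereo baselines on cars matter.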

Even if they wanted to avoid any redundant hardware and only go with one camera for each direction, there is still a chance they could've avoided this kind of issue if they used structure through motion, but that's much harder to do if the objects could be moving.

https://en.m.wikipedia.org/wiki/Structure_from_motion
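The core of structure from motion is triangulation: if you know how the camera moved between two frames, a feature tracked across both frames can be located in 3D. A toy sketch with invented numbers (camera translated 1 m sideways, noise-free observations), using the standard linear triangulation method:

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one point from two views.

    P1, P2   - 3x4 projection matrices for the two camera poses
    pt1, pt2 - (x, y) pixel coordinates of the same feature in each view
    """
    A = np.array([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # homogeneous least squares
    X = Vt[-1]
    return X[:3] / X[3]                  # dehomogenize

K = np.array([[700.0, 0, 0], [0, 700.0, 0], [0, 0, 1.0]])  # intrinsics (hypothetical)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])               # first frame
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])   # camera moved 1 m along x

X_true = np.array([2.0, 1.0, 10.0])      # a point 10 m ahead
pt1 = P1 @ np.append(X_true, 1); pt1 = pt1[:2] / pt1[2]
pt2 = P2 @ np.append(X_true, 1); pt2 = pt2[:2] / pt2[2]
print(np.round(triangulate(P1, P2, pt1, pt2), 3))  # recovers [2. 1. 10.]
```

The catch the comment mentions: this assumes the scene is static between frames. If the feature itself is moving, the recovered depth is wrong, which is why motion-only depth is much harder in traffic.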

this post was submitted on 23 Mar 2025
424 points (96.9% liked)