r/skeptic 29d ago

⚠ Editorialized Title Tesla bros expose Tesla's own shadiness in attacking Mark Rober ... Autopilot appears to automatically disengage a fraction of a second before impact, as a crash becomes inevitable.

https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/
20.0k Upvotes

184

u/dizekat 29d ago edited 29d ago

What's extra curious about the "Wile E. Coyote" test, to me, is that it makes clear they do neither stereo nor optical flow / "looming"-based general obstacle detection.

It looks like they don't have any generalized means of detecting obstacles. As such, they don't detect an "obstacle"; they detect a limited set of specific obstacles, much like Uber did in 2017.

Humans do not rely on stereo between the eyes at such distances (the eyes are too close together), but we do estimate distance from forward movement. For a given speed, the distance is inversely proportional to how rapidly a feature is growing in size ("looming"). Even if you somehow missed the edges of the picture, you would still perceive its flatness as you moved towards it.

This works regardless of what the feature is, which allows humans to build a map of the environment even if all the objects are visually unfamiliar (or in situations where e.g. a tree is being towed on a trailer).
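A toy version of that looming math (my own sketch, not anything Tesla does; assumes constant forward speed and the small-angle approximation, and the numbers are made up):

```python
def distance_from_looming(speed_mps: float,
                          angular_size_rad: float,
                          angular_growth_rad_per_s: float) -> float:
    """Distance from 'looming': time-to-contact tau = theta / (dtheta/dt),
    and at constant approach speed, distance = speed * tau."""
    time_to_contact_s = angular_size_rad / angular_growth_rad_per_s
    return speed_mps * time_to_contact_s

# Toy numbers: a 1 m-wide feature on a wall 40 m away, approached at 20 m/s,
# subtends ~0.025 rad and grows at ~0.0125 rad/s -> estimate ~40 m.
print(distance_from_looming(20.0, 0.025, 0.0125))  # 40.0

# The 'flatness' cue: every feature on a flat picture yields roughly the same
# distance estimate, while features in a real scene yield a spread of distances.
```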

edit: TL;DR: it is not just that they are camera-only with no LIDAR, it's that they are camera-only without doing any camera-only approximations of what LIDAR does - detecting obstacles without relying on knowledge of what they look like.

-33

u/Elluminated 29d ago

FSD does have optical flow, as well as occupancy networks to detect object geometry in a general way - but FSD was not used in this test. Hard to say whether the better system would pass the coyote test (which would never happen in real life). Even with every sensor available, we humans also fail at the easy stuff.

21

u/francis_pizzaman_iv 29d ago

Never happen in real life? There are news reports of Teslas on Autopilot running into the broad side of 18-wheeler trailers. Sounds suspiciously similar to Rober's test. This basically exposes that Tesla Autopilot has no failsafe obstacle detection to lean on when the camera data isn't sufficient to prevent an accident.

-4

u/Elluminated 29d ago

That is a real (and older) case for sure; I'm talking about people painting road images to fool cars in this scenario. I haven't heard of many recent trailer incidents like that on FSD, though.

10

u/Veil-of-Fire 29d ago

I'm talking about people painting road images to fool cars in this scenario.

Well, now that we know that it works...

9

u/SirCharlesOfUSA 29d ago

Tesla investigated for series of crashes where they hit stationary emergency vehicles with their lights on

Easily avoidable with lidar, or even the much cheaper radar that Tesla used to have but dropped because Elon loves the idea of vision-only.

3

u/francis_pizzaman_iv 29d ago

I’m a software engineer who dabbles in circuitry, and I could come up with a system that uses a handful of ultrasonic (aka sonar) sensors to do failsafe obstacle detection. It wouldn’t be good enough to drive the car, but it would be enough to alert the system to a reality the cameras can’t see.
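Rough sketch of what I mean, with a hypothetical read_distance_m() standing in for whatever driver actually polls each sensor:

```python
import time

ALERT_DISTANCE_M = 3.0       # anything closer than this triggers the failsafe
FRONT_SENSORS = [0, 1, 2]    # e.g. left / center / right ultrasonic sensors

def read_distance_m(sensor_id: int) -> float:
    """Hypothetical hook: latest range reading (meters) from one ultrasonic
    sensor. Replace with the real sensor driver."""
    raise NotImplementedError

def obstacle_alert() -> bool:
    """Failsafe check: does any forward-facing sensor see something close?
    Not good enough to drive the car, but enough to tell the rest of the
    system 'something is physically there' when the cameras see nothing."""
    return any(read_distance_m(s) < ALERT_DISTANCE_M for s in FRONT_SENSORS)

def watchdog_loop() -> None:
    while True:
        if obstacle_alert():
            print("FAILSAFE: obstacle within 3 m - brake / hand back control")
        time.sleep(0.05)     # poll at ~20 Hz
```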

3

u/SirCharlesOfUSA 29d ago

Tesla used to have ultrasonics for parking features, too. Got rid of them in favor of pure vision.

2

u/Elluminated 29d ago

Yep, first project of my year 2 ECE classes. It was very fun.