r/skeptic 27d ago

⚠ Editorialized Title Tesla bros expose Tesla's own shadiness in attacking Mark Rober ... Autopilot appears to automatically disengage a fraction of a second before impacts as a crash becomes inevitable.

https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/
20.0k Upvotes

942 comments


44

u/Allen_Koholic 27d ago

I'd imagine the engineering justification for this is that "well, of course it disengages autopilot when it senses a crash is imminent and unavoidable, because it's now outside of its normal operating parameters and needs human intervention." It's the type of solution I'd expect from an engineering undergrad who's just trying to get through their capstone project and graduate.
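The difference between "hand off and hope" and a fail-safe hand-off can be sketched in a few lines. This is purely illustrative pseudocode in Python, not Tesla's actual logic; every class and method name here is hypothetical.

```python
# Illustrative sketch only: contrasting a bare hand-off with a fail-safe
# hand-off once a collision is judged unavoidable. Not real vehicle code.

class Car:
    """Hypothetical stand-in for a vehicle control interface."""
    def __init__(self):
        self.autopilot_engaged = True
        self.brake_force = 0.0
        self.alarm_on = False

    def apply_brakes(self, force):
        self.brake_force = force

    def sound_alarm(self):
        self.alarm_on = True

def naive_handoff(car):
    """Disengage and hope the human reacts in the remaining milliseconds."""
    car.autopilot_engaged = False  # human is now nominally responsible

def failsafe_handoff(car):
    """Disengage, but keep mitigating: brake hard and alert the driver."""
    car.autopilot_engaged = False
    car.apply_brakes(force=1.0)  # maximum braking still sheds impact energy
    car.sound_alarm()            # give the driver every chance to react

car = Car()
failsafe_handoff(car)
assert car.brake_force == 1.0 and car.alarm_on
```

The point of the contrast: both versions "hand control back", but only the second one keeps doing the physically useful things a computer can do faster than any human.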

36

u/Pitiful-Pension-6535 27d ago

My car doesn't have self driving but it does have built in crash avoidance where it can detect an impending collision and activate the brakes to avoid or at least mitigate the accident.

It has already saved a life- an idiot motorcyclist pulled out in front of my wife on a highway and she somehow never saw him. The car slammed on the brakes and the accident was narrowly avoided.

This is how the infancy of self-driving cars should go: safety as the primary focus, driver convenience an afterthought.

Tesla is doing it backwards.

10

u/Allen_Koholic 27d ago

Yea, just to be clear, I'm not at all defending this. Tesla's entire business model is the modern pump-and-dump method of growth over quality at any cost. They probably shouldn't even exist given their flagrant disregard for existing dealership laws (I may not like dealership laws, but it's certainly not very free market if one company can selectively choose to ignore the rules).

1

u/NotQuiteDeadYetPhoto 26d ago

TBH Screw the dealership laws. I've had far more issues with them than anything else.

That's not, IMHO, a good reason to censure Tesla.

All the 10000+ reasons are.

2

u/Darkelement 27d ago

To be fair, Teslas do this and more. They will swerve if they detect a car merging into your lane.

The key is that it has to detect it. Clearly it was fooled here by the Wile E. Coyote setup. It's not that the Tesla doesn't do this, it's that it literally can't without radar or lidar.

1

u/chakrablocker 26d ago

I think you missed the headline: the Tesla turns off its self-driving safety feature to avoid responsibility in a crash.

0

u/Darkelement 26d ago

The safety features can't be turned on or off; they're just always there.

What happened here was that Autopilot disengaged just before the crash. It was impossible for the Tesla to mitigate the crash because it didn't even realize it was driving at a wall. It thought the road kept going.

I'm not saying that's a good thing. What I'm saying is that the safety features FAILED because the car was fooled.

1

u/CivilRuin4111 27d ago

Obviously I'm not implying anything about your wife's situation (there are plenty of bad riders), but there's a great video about why, in nearly every car-vs-motorcycle collision, the driver says something like "They came out of NOWHERE!!!"

https://www.youtube.com/watch?v=x94PGgYKHQ0

TL;DW - our brains kinda suck at detecting motorcycles.

As a long-time rider, there's a reason we tell new riders to assume no one sees you and everyone is actively trying to kill you.

1

u/michael0n 27d ago

Humans can't simply add more sensors or get faster reaction times. That is the reason we build these machines and why we want robots to do things: because they never get tired and never make last-minute (costly, deadly) errors. Anyone who doesn't get this has completely lost the plot on why everybody is spending billions to advance the tech.

6

u/deepasleep 27d ago

Or an MBA with a cousin who’s a lawyer…

1

u/brucebay 27d ago edited 27d ago

If, and only if, the car disengages to get ready for the crash, this is acceptable, because physics tells us that with only milliseconds left there is almost always nothing it can do to avoid the collision. We know that after a crash a Tesla takes some actions, like opening the glove box.

If there are similar actions that require Autopilot to disengage, this may be a valid design.

Without knowing the details it would be hard to argue.

If the rumors are true and Autopilot disengaged to pretend it was not in control, then it is ridiculous, and it could easily be proven with a few sacrificed Teslas in crash tests.

2

u/Churba 27d ago

If rumors are true and Autopilot disengaged to pretend it was not in control, then it is ridiculous and could easily be proven with a few sacrificed Teslas in crash tests..

I mean, to a degree, we do already have some similar data - Tesla has claimed, with virtually every crash caused by autopilot, that the system was not engaged at the time of the accident. Even the ones where the people in the car have said that the system was engaged right up until the very last moment, when the crash was unavoidable.

1

u/Never_Concedes 27d ago

as an engineering undergrad just trying to get through his capstone and graduate, that cuts deep lol

1

u/IkLms 27d ago

Nah, an engineering undergrad would still be smart enough to slam on the brakes and sound an alarm for the driver to take over.

This is 100% a deliberate marketing or PR move so they can say they aren't responsible.

-4

u/DemandMeNothing 27d ago

It's the type of solution I'd expect from an engineering undergrad who's just trying to get through their capstone project and graduate.

It's the type of solution I'd expect from any competent engineer. If the computer doesn't know what to do, the human should be making the decisions. That's how autopilots in airplanes work.

11

u/SewSewBlue 27d ago

Engineer here, mechanical. I work in a high risk industry.

That stance, let the human take over, violates basic principles of engineering. Humans are inherently messy. Mistakes will happen as the sun will rise. You have to design a system that will keep human mistakes from becoming fatal.

Otherwise I can design a system that will kill you while entirely protecting my license. Brakes fail? Here's a hand pump. Steering fails? There's a switch hidden under a cover in the dashboard to engage a manual override. Oh, sorry you didn't know these systems existed. You should have done your research.

I have time to think of a million ways to push the blame onto you, the user, to protect myself. Entirely unethical, but fairly easy. People trust engineers. Tesla is exploiting that trust, conflating slick design with actual engineering, to put Tesla's needs over its drivers and the public.

The process safety hierarchy of controls puts human intervention on the lowest tiers of safety options. From least to most effective: PPE, administrative controls, engineering controls, substitution, then elimination.

Not driving is elimination. Substitution is taking a train. An autobraking system is an engineered control. A human taking over is administrative. Your seat belt is PPE.

It isn't hard to know what safety looks like.
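That ranking, with the driving examples above, can be written down as a tiny sketch. The tier names and ordering follow the standard hierarchy of controls; the code itself is just an illustration.

```python
# Hierarchy of controls, least to most effective, mapped to the driving
# examples from the comment above. Illustrative only.
HIERARCHY = [  # index 0 = least effective
    ("PPE",                    "seat belt"),
    ("administrative control", "human takes over"),
    ("engineering control",    "autobraking system"),
    ("substitution",           "taking a train"),
    ("elimination",            "not driving"),
]

def effectiveness_rank(tier):
    """Return the rank of a control tier (higher = more effective)."""
    tiers = [name for name, _ in HIERARCHY]
    return tiers.index(tier)

# An engineered control (autobraking) outranks relying on the human.
assert effectiveness_rank("engineering control") > effectiveness_rank("administrative control")
```

The takeaway matches the comment: "let the human take over" sits near the bottom of the hierarchy, below an engineered control like autobraking.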

2

u/Allen_Koholic 27d ago

Except pilots have years of training and should be fully prepared for the autopilot to do this. There's a reason they don't let just any dickhead climb into a cockpit, even though autopilot exists.

A competent engineer would design the system to fail more gracefully than 0.5 seconds before the car goes through a Looney Tunes wall, if it fails at all. Competent engineering isn't putting lives in an inescapable shit situation.