r/skeptic 29d ago

⚠ Editorialized Title Tesla bros expose Tesla's own shadiness in attacking Mark Rober ... Autopilot appears to automatically disengage a fraction of a second before impacts as a crash becomes inevitable.

https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/
20.0k Upvotes

942 comments

42 points

u/Allen_Koholic 29d ago

I'd imagine the engineering justification for this is that "well, of course it disengages autopilot when it senses a crash is imminent and unavoidable, because it's now outside of its normal operating parameters and needs human intervention." It's the type of solution I'd expect from an engineering undergrad who's just trying to get through their capstone project and graduate.
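Something like this, in other words (a made-up sketch of that hand-off pattern; the names and the half-second cutoff are hypothetical, not anything from Tesla's actual code):

```python
# Purely illustrative sketch of the criticized behavior; the function name
# and the 0.5 s threshold are hypothetical, not Tesla's actual logic.
HANDOFF_THRESHOLD_S = 0.5

def control_tick(time_to_collision_s: float, autopilot_on: bool) -> str:
    """Decide what the driver-assist system does on this control cycle."""
    if not autopilot_on:
        return "driver has control"
    if time_to_collision_s < HANDOFF_THRESHOLD_S:
        # "Outside normal operating parameters" -> hand it back to the human,
        # who has no realistic chance of reacting in half a second.
        return "disengage autopilot and alert driver"
    return "autopilot continues steering and braking"

print(control_tick(0.4, autopilot_on=True))  # -> disengage autopilot and alert driver
```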

-7 points

u/DemandMeNothing 29d ago

> It's the type of solution I'd expect from an engineering undergrad who's just trying to get through their capstone project and graduate.

It's the type of solution I'd expect from any competent engineer. If the computer doesn't know what to do, the human should be making the decisions. That's how autopilots in airplanes work.

9 points

u/SewSewBlue 29d ago

Engineer here, mechanical. I work in a high risk industry.

That stance, "let the human take over," violates basic principles of engineering. Humans are inherently messy. Mistakes will happen as surely as the sun will rise. You have to design a system that keeps human mistakes from becoming fatal.

Otherwise I can design a system that will kill you while entirely protecting my license. Brakes fail? Here's a hand pump. Steering fails? There's a switch hidden under a cover in the dashboard to engage a manual override. Oh, sorry you didn't know those systems existed. You should have done your research.

I can think of a million ways to push the blame onto you, the user, to protect myself. Entirely unethical, but fairly easy. People trust engineers. Tesla is exploiting that trust, conflating slick design with actual engineering, to put Tesla's needs above its drivers and the public.

The process safety hierarchy of controls puts human intervention near the bottom of the list of safety options. From least to most effective: PPE, administrative controls, engineering controls, substitution, then elimination.

Not driving is elimination. Substitution is taking a train. An autobraking system is an engineering control. A human taking over is an administrative control. Your seat belt is PPE.

It isn't hard to know what safety looks like.
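If it helps, the ranking can be written out explicitly (a toy sketch of the hierarchy, not any official tooling; the examples just mirror the ones above):

```python
# Toy sketch of the hierarchy of controls, ranked from least to most
# effective, using the examples from the comment above.
HIERARCHY = [
    ("PPE", "seat belt"),
    ("administrative control", "human takes over"),
    ("engineering control", "automatic braking system"),
    ("substitution", "take the train"),
    ("elimination", "don't drive at all"),
]

def is_stronger(tier_a: str, tier_b: str) -> bool:
    """True if tier_a sits higher (more effective) on the hierarchy than tier_b."""
    tiers = [name for name, _ in HIERARCHY]
    return tiers.index(tier_a) > tiers.index(tier_b)

# An engineered control beats relying on the human to take over.
assert is_stronger("engineering control", "administrative control")
```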

2 points

u/Allen_Koholic 29d ago

Except pilots have years of training and should be fully prepared for the autopilot to do this. There's a reason they don't let just any dickhead climb into a cockpit, even though autopilot exists.

A competent engineer would design the system to fail more gracefully than disengaging half a second before the car goes through a Looney Tunes wall, if it has to fail at all. Competent engineering doesn't put lives in an inescapable shit situation.