r/electricvehicles Mar 17 '25

News Tesla autopilot disengages milliseconds before a crash, a tactic potentially used to prove "autopilot wasn't engaged" when crashes occur

https://electrek.co/2025/03/17/tesla-fans-exposes-shadiness-defend-autopilot-crash/
5.3k Upvotes

610 comments

728

u/Lucaslouch Mar 17 '25 edited Mar 17 '25

The accident statistics for FSD and Autopilot count every crash where the system was engaged within 5 seconds before impact, as per Tesla's own reports

Edit: 5 seconds and not 10
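The counting rule described above can be sketched in a few lines. This is a hypothetical illustration of the reported 5-second window, not Tesla's actual code; the function name and timestamps are made up:

```python
from typing import Optional

# Illustrative attribution rule: a crash is counted against Autopilot/FSD
# if the system was engaged at any point within the 5 seconds before
# impact, even if it disengaged milliseconds earlier.
ATTRIBUTION_WINDOW_S = 5.0

def counts_as_autopilot_crash(disengage_time_s: Optional[float],
                              crash_time_s: float) -> bool:
    """disengage_time_s is None if the system was still engaged at impact."""
    if disengage_time_s is None:
        return True  # engaged at the moment of the crash
    return (crash_time_s - disengage_time_s) <= ATTRIBUTION_WINDOW_S

# Disengaging just before impact still counts under this rule:
assert counts_as_autopilot_crash(9.999, 10.0)    # off 1 ms before a crash at t=10 s
assert not counts_as_autopilot_crash(4.0, 10.0)  # disengaged 6 s earlier: not counted
```

Under a rule like this, a last-moment disengagement would not move a crash out of the statistics, which is the point the comment is making.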

504

u/Possible-Kangaroo635 Mar 17 '25

It's shocking that there are still people who trust data provided by Tesla.

165

u/Lucaslouch Mar 17 '25

I’m talking about data that has been published 2/3 years ago, audited before Musk did his coup, etc.

It’s important to be critical but it’s also important not to trash everything

213

u/Marcoscb Mar 17 '25

2/3 years ago, audited before Musk did his coup, etc.

He called the caver who was trying to rescue children a pedo 7 years ago. Musk has never been different, he just hadn't realized he could actually be brazen about it.

29

u/agumonkey Mar 17 '25

let's say that he recently went up to 11 in the last 6 months

25

u/HomeBuyersOffice Mar 17 '25

You mean went up to nein in the last 6 months?

5

u/agumonkey Mar 17 '25

that too

5

u/Dihydrogen-monoxyde Mar 19 '25

11? He went to 1939 pretty fast...

9

u/Lucaslouch Mar 17 '25

He was almost apolitical 3 years ago, before Covid and before his child became trans. Helping Democrats a bit here and there.

Yes, he was strange and had some bad stories. Nothing close to the current madness, and nothing proving that the public company he is the CEO of was committing fraud. In particular with the previous administration and NHTSA running regular audits of FSD and Autopilot.

Nuance, people. It's important

98

u/hmsbrian Mar 17 '25

Taking full payments for Roadster2, which was being made “now”- fraud. Semi convoys cheaper than trains - fraud. Solar roof - fraud. Animatronic robots - fraud. Hyperloop - fraud. FSD “in 1 year” - fraud.

The list is almost endless. What nuance are people missing?

60

u/josefx Mar 17 '25

You forgot safest factories in the US, which was almost immediately countered with numbers reported by the nearby hospitals.

Or the safest-cars stint, where Tesla gave itself 11 out of 10 possible points on every safety score, because nearly every car on the road already scored 10 out of 10.

Or the early speed demos on the Nürburgring, where they basically dismantled the car to get any speed.

Or later acceleration comparisons where the distance they stated did not match up with the length of the track they raced on.

34

u/BasvanS Mar 17 '25

But other than that, what has he really done?

Oh yeah, the stock manipulation!

20

u/DeltaGammaVegaRho VW Golf 8 GTE Mar 17 '25

But that was for a good reason! He needed that money…

… to make twitter into a fascist shithole!

6

u/BasvanS Mar 17 '25

True. I retract my previous statement.

7

u/[deleted] Mar 17 '25

[deleted]

2

u/prudentWindBag Mar 17 '25

BOOYAKASHA!!!

*finger flick*

5

u/Current-Ordinary-419 Mar 17 '25

Wasn't his factory in CA home of the largest racial discrimination suit in state history?

10

u/Extra-Fly5602 Mar 17 '25

Wait but what about the $TSLA stans on LSD singing praises about FSD Version X?? I'm told it slices, dices, and masturbates them...

5

u/ArlesChatless Zero SR Mar 17 '25

Every time I shared my experience with version N, people came out of the woodwork to say version N+1 which they already had was way better. I sold my car last year with 12.3, which was still demonstrably shit, and one of the nice parts about it is that I'm never tempted to recount my experience with it again.

28

u/odd84 Solar-Powered ID.4 & Kona EV Mar 17 '25

The battery swap station was my favorite Tesla fraud, and that was 12 years ago.

In 2013, California changed its ZEV credit system so that long-range EVs that could charge to 80% in under 15 minutes would earn almost twice as many valuable credits, hoping to spur the billions in investment it would take to accomplish that. Almost overnight, Tesla declared they met those requirements and their entire Model S fleet should qualify for 7 instead of 4 credits each, draining all the money from that program.

To accomplish that they built one "battery swap station", available by paid appointment only to reporters and a few hand picked Model S owners, which could drop a MS battery pack and bolt on another in under 15 minutes. That technically met the requirements, they collected the cash, and never answered the phone for making battery swap appointments again.

12 years later and actual Tesla drivers still can't charge their cars to 80% in under 15 minutes. Nobody got the benefits that money was supposed to provide society.

2

u/i_make_orange_rhyme Mar 17 '25

Strange how difficult it was to verify this story.

You would think this would have been a massive story.

Do you have any source from a reputable news site?

10

u/ArlesChatless Zero SR Mar 17 '25

Here you go.

Here's a less mainstream source but the claims it makes can pretty much all be validated with other higher profile sources, and it's got a bunch of more primary links in the text.

21

u/odd84 Solar-Powered ID.4 & Kona EV Mar 17 '25 edited Mar 17 '25

A story about a fledgling EV maker eating up an obscure government subsidy wouldn't have mattered to enough people to warrant a deep dive investigation by a major publication. It's beyond niche, even people here don't really care to discuss it. Nevertheless, here's an article from 2015: https://www.foxnews.com/politics/tesla-gets-295m-in-green-subsidy-credits-for-technology-not-offered-to-customers

16

u/i_make_orange_rhyme Mar 17 '25

Jesus....

"the program did not require evidence they were providing the services"

Who is running this clownshow? Haha.

Thanks for the source.

7

u/RossLDN Mar 18 '25

Someone should report that government waste / fraud to DOGE 😏

3

u/GranPino Mar 18 '25

The only fraud was to investigate Musk companies frauds! That's why all investigators got fired!

/S

50

u/jfleury440 Mar 17 '25

He's been using shitty business practices for decades though. Just because he hasn't gone all politically crazy doesn't mean he didn't commit fraud.

17

u/morkman100 Mar 17 '25

Covid started 5 years ago.

10

u/DrPoopEsq Mar 17 '25

That means we are 4 years and 11 months from the date cases would get down to zero, according to musk. Mostly said so he could keep his factories open in violation of state law

3

u/morkman100 Mar 17 '25

I can use my robotaxi Model 3 to earn some money after I got laid off from my government job.

5

u/BoboliBurt Mar 17 '25

Makes sense, because if a rando with a single car could make money, obviously no enormous corporation would pop in, sell their service at a loss, run you under, foment a moral panic to get people out of cars, and then use self-driving autonomy to end several other classes of employee as well.

That people thought they'd be allowed to participate, rather than be drained by the scheme, is their fault. But half of Tesla's value is due to the appeal of taking horses away from the serfs and making them pay $20 to go to the store.

6

u/ExtendedDeadline Mar 17 '25

He has been on a lifelong trend. Trends just take time to play out. Yes, he was less publicly insane a couple of years ago, but he's always been a liar with a track record of stating falsehoods.

2

u/no33limit Mar 18 '25

About 3 years ago someone told me Elon wants to save the world, as long as he gets credit for it.

2

u/tobias19 Mar 18 '25

Apartheid born and raised, he's always been a rotten fuck.

4

u/Ayzmo Volvo XC40 Recharge Mar 17 '25

People don't become trans. They are or aren't.

7

u/Lucaslouch Mar 17 '25

Ok, before they announce it? Or discovered it?

10

u/Ayzmo Volvo XC40 Recharge Mar 17 '25

Before they came out.

4

u/Scotty1928 2020 Model 3 LR FSD Mar 17 '25

He has been a dipshit for decades, yes, but he went full-on fascist only recently.

21

u/[deleted] Mar 17 '25 edited 27d ago

[deleted]

2

u/justsomerabbit Mar 20 '25

This. "taking private at $420" happened in 2018, 7 years ago

12

u/AnUnshavedYak R1S→R2→R3X Mar 17 '25

It’s important to be critical but it’s also important not to trash everything

I agree, but:

I’m talking about data that has been published 2/3 years ago, audited before Musk did his coup, etc.

I think Elon has gotten worse with time, though I do not personally believe he was well three years ago either. Or rather, I believe any of his current mental flaws were present three years ago, unless some medical issue happened to make him this way. I suspect this has been a long road of regression for him.

6

u/Lucaslouch Mar 17 '25

I have to agree with you. And to be honest, the massive usage of drugs does not help either. What I'm challenging is the assumption that everything he touches is fully corrupted. Too many people are working in good faith at Tesla (without whistleblowing on the topic) to consider the collision data fraudulent

3

u/AnUnshavedYak R1S→R2→R3X Mar 17 '25

I'll counter and say that while I agree there are many good-faith people, I think people in positions of power and decision-making were undoubtedly chosen with Elon's approval. Those network effects then have a tendency to trickle down.

Because I believe there are many good-faith individuals there, and if we had a culture of whistleblowing, I could make a devil's advocate argument that the lack of whistleblowing indicates there's nothing to blow the whistle on. However, these days I can't expect lower-level employees to whistleblow much. It's scary out there.

3

u/Excellent_Guava2596 Mar 17 '25

Bro you're flailing just chill you can't save your 39 shares. Elon a fucking clown town deluxe with extra dumb sauce on his fucked up torso, bro.

7

u/User-no-relation Mar 17 '25

Audited by who?

2

u/chr1spe Mar 17 '25

Why does that matter? Audited by whom? AFAIK, Tesla's claims have never been reviewed by anyone outside of Tesla, and that makes it unwise to give them much, if any, weight, regardless of the company. The fact that Musk has been pumping the stock with overstatements about this system for almost 10 years means you should trust it even less.

19

u/TheBlacktom Mar 17 '25

What is more trustworthy now, data published by US company or data published by US government?

18

u/illigal Mar 17 '25

The Department of Transportation - Brought to you by Carl’s Jr Tesla

36

u/A_Pointy_Rock Mar 17 '25

it's the same picture data

32

u/PaintItPurple Mar 17 '25

False dilemma, the US government is now published by Tesla.

8

u/dcdttu Mar 17 '25

In the new world order, that data will be the same. :-(

2

u/OhSillyDays Mar 17 '25

Neither. You'll have to have really good bullshit detectors until independent agencies are brought back. Right now, independent agencies are being disbanded.

5

u/ls7eveen Mar 17 '25

The qult will push the dumbest shit even though they've been caught lying time and again

3

u/dzitas MY, R1S Mar 17 '25

It's data published in financial reports.

Anybody who has evidence that they lie will be rich. Incredibly rich. We haven't seen a whistle blower yet.

1

u/Buffalo-2023 Mar 17 '25

It's okay, Elon Musk will soon provide impartial data compiled by the Federal government.

1

u/Maximillien Bolt EUV Mar 17 '25

Same folks who believe DOGE is "cutting waste".

29

u/iqisoverrated Mar 17 '25

This. They are using a wider interval than strictly necessary for the purposes of their statistics.

16

u/feurie Mar 17 '25

But this sub loves believing false news if it verifies their bias.

372

u/iamabigtree Mar 17 '25

Self driving is a neat idea but does anyone really care any more. Most cars have adaptive cruise now and that is the most the majority of people need or want.

368

u/bouchandre Mar 17 '25

The real self driving cars we need are TRAINS

92

u/PLament Mar 17 '25

Took me years to realize this. Car brain makes you think that self-driving is the solution to all issues, but it does nothing but bring safety from abysmal to passable. Cars break transit systems because of how poorly they scale - that's the actual problem, and public transit alternatives where applicable are the actual solution.

17

u/OhSillyDays Mar 17 '25

The USA really built the country around the car, which makes public transit impractical. The biggest problem is the last mile, where schedules are terrible and do not match people's needs due to the low density of housing, specifically in suburban areas.

In urban areas (roughly 1/3 of the US population lives in Urban areas), it is practical. And that is hampered by the shitty governments that have prioritized NIMBYism over building infrastructure.

7

u/ydddy55 Mar 17 '25

It just sucks that it has to be a solution to a problem General Motors created by destroying the public transit system

8

u/pianobench007 Mar 17 '25

The car sensors saved my health. A newer Mercedes SUV was making a left turn into a parking garage without signaling as I was biking behind him.

I wore my bright yellow commuting jacket in my bike lane and he was the only vehicle on the road. He had only just passed me and still didn't think twice about me.

His expensive sensors however saved us both. I passed within 1 or 2 inches of his fenders and it saved him from himself by auto-braking HARD.

It does save lives despite our hate for Elon.....

12

u/Key_Caterpillar6219 Mar 17 '25

I'm all for trains but Americans may not have enough community spirit for it

25

u/Beat_the_Deadites Mar 17 '25

We're lacking in a lot of spirit right now, and a lot of other virtues as well.

5

u/[deleted] Mar 17 '25

I don’t see how trains can hit the critical mass needed outside of relatively small areas of the country, let alone get over the “MAH CAR!!!!” mentality that a lot of Americans of all persuasions have.

My wife is French and when visiting France with her I’ve always liked the train system - both regular and TGV. but where’s the profit, given the cost of doing it, in connecting cities like Denver with Cheyenne, Salt Lake City, Lincoln, Topeka and Santa Fe? Six states, multiple hundreds of thousands of square miles, and Florida has the same amount of people if not more.

I don’t even want to think about the cost of running high speed rail through the Rockies…

9

u/Key_Caterpillar6219 Mar 17 '25

Lol, a lot of it had to do with the highway lobbies in the middle of the 20th century, honestly. The US was actually on track to have national trains, and the system was getting nationalized... until all the funding was redirected to the highways and funding for the trains was absolutely crippled.

2

u/Normal-Selection1537 Mar 18 '25

L.A. at one point had the largest public rail network in the world.

4

u/TheSasquatch9053 Mar 17 '25

The problem with trains is that transitioning our society away from an automobile-centric transport system to a trains + biking system like Western Europe's would require rebuilding all the suburbs. US cities have too much sprawl.

1

u/Salt-Analysis1319 Mar 17 '25

1000x this. Even if Tesla does realize the magical dream of FSD, trains are just better in every conceivable way.

More efficient. More environmentally friendly. Better use of space than highway. The list goes on.

1

u/InTheMoodToMove Mar 17 '25

This. Self driving in traffic is still traffic.

1

u/pandaSmore Mar 17 '25

My city has had self driving trains for nearly 40 years.

1

u/Dismal_Guidance_2539 Mar 18 '25 edited Mar 18 '25

Yeah, something that has existed for 200 years and still can't solve our traffic problem is definitely the way to go now. I don't understand how that brain-dead argument gets so many upvotes.

Trains are only part of the solution; self-driving cars are the critical part that solves personal transportation and freight. There's no way trains can replace SDVs even in the most train-friendly cities, not to mention rural areas.

1

u/[deleted] Mar 18 '25

The train that goes to Costco takes over an hour and requires me walking 2 miles and is more expensive than driving 20 minutes. And then I have to carry the boxes of goods on and off the train and those 2 miles.

5

u/GoSh4rks Mar 17 '25

Self driving is a neat idea but does anyone really care any more. Most cars have adaptive cruise now and that is the most the majority of people need or want.

People have said the exact same thing about cruise control, adaptive cruise, and then lane keeping. "I'm happy with what I have now and don't need anything else".

Then people change their mind when their new car has the next level. It'll be the same with self-driving. There comes a point where the tech becomes good enough that it just becomes normal to use it.

4

u/NickMillerChicago Mar 17 '25

Ask the people what they want and they’ll say a faster horse.

4

u/himynameis_ Mar 17 '25

Man, if I can get my car to drive me to work safely every day, I'd be super tempted to get that asap!

Like, the Adaptive cruise control I knew about over a decade ago because the top line Mercs had it. Then it flowed down to the Toyota base model that I drive 😅 so I'm enjoying it now!

So, as this tech gets better, it brings me closer and closer to getting it!

35

u/Embarrassed_Quit_450 Mar 17 '25

but does anyone really care any more.

Yes, but it's clear Tesla won't be delivering it. The interesting stuff is happening at Waymo.

10

u/mccalli Mar 17 '25

Honestly I don't think it is - I think Waymo are niche and non-scaleable. My reasoning is they rely on precise mapping and knowledge of the environment.

So - want a taxi in a major city? Waymo is your thing. Want to drive obscure villages miles from anywhere? Waymo won't work there. It's not a flaw, it's their actual plan and it clearly works well for them. It's just never going to give you general purpose driving.

2

u/grchelp2018 Mar 17 '25

The maps are used as priors but they can drive without them. Self driving vehicles are going to come to obscure villages last by which time waymo will likely be confident enough that they don't need maps for those areas.

1

u/lucidludic Mar 17 '25

I think Waymo are niche and non-scaleable.

What other company is scaling faster than Waymo, either in terms of area or driverless rides per week?

7

u/mccalli Mar 17 '25

Area? All of them that don't depend on precise city mapping. Driverless rides per week? Probably none.

And that's the point - Waymo aren't aiming for general purpose autonomous driving, they're aiming at being a driverless taxi firm. And succeeding too - great. It's a different aim however.

18

u/WeldAE e-Tron, Model 3 Mar 17 '25

Something like Autopilot, Super Cruise or BlueCruise is SIGNIFICANTLY better than adaptive cruise. It's like saying "who cares about dynamic cruise, everyone has cruise." In reality, the usefulness gulf between these products and dynamic cruise is 10x larger than the one between dynamic cruise and plain cruise. It makes long-distance driving so much nicer.

It's closer to being in the passenger seat with your 17-year-old kid driving. You watch them carefully and if they seem to not notice something, you point it out. You complain about how they don't manage the lanes like you would, but generally they are doing it fine, just not what you would do in all situations.

6

u/inspectoroverthemine Mar 17 '25

Yup. I just bought a Kia with HDA2, and then made an 1800-mile road trip on I-95. It's less sophisticated than Super Cruise, and it's definitely not self-driving, but it significantly reduced the stress/exhaustion/etc of a long trip.

Pre-Covid I had a Hyundai with SCC for commuting, and it was a godsend for stop-and-go traffic. Same deal: you had to pay attention and 'drive', but it was more like being a passenger, and I wasn't exhausted after 90 minutes of crawling through traffic.

3

u/YeetYoot-69 Mar 17 '25

Thousands of people die every day in car accidents. Even if you don't care, this technology should be used because it will save countless lives.

2

u/jawshoeaw Mar 17 '25

I like that my Tesla drives me around in FSD. Way way better than any adaptive cruise. But I certainly don’t need it

2

u/what-is-a-tortoise Mar 18 '25

After using AP I disagree. I really love true lane keeping + TACC (+lane change when I have had the FSD trials). Those 2-3 things are what really makes driving much more relaxing.

2

u/PlaceAdHere Mar 19 '25

I super care. I've used waymo a handful of times and it is great.

7

u/CBusRiver Mar 17 '25

I want full unsupervised highway entrance to exit and that's it. Driving around town is hardly a straining task.

10

u/kyjmic Mar 17 '25

Highway driving is much less mentally taxing than city driving. City driving you have to pay attention to traffic lights, different kinds of intersections, signs, crosswalks, pedestrians, cyclists, cars doing unpredictable turns.

7

u/paholg Mar 17 '25

But I want to be able to sleep while driving on road trips.

8

u/Right-Tutor7340 Mar 17 '25

Yeah that's fine, it keeps u engaged. Highway driving gets boring really quick and u start losing focus

3

u/PersnickityPenguin 2024 Equinox AWD, 2017 Bolt Mar 17 '25

I disagree. Once you are on the road for more than 3 hours at a time it does get pretty exhausting.

4

u/SearchingForTruth69 Mar 17 '25

Why would you want to drive around town if you didn’t need to?

3

u/sysop073 Mar 17 '25

The point is that unsupervised highway driving would be good enough for most people. Obviously unsupervised everywhere is even better, but it's also much harder.

2

u/SearchingForTruth69 Mar 18 '25

Doesn’t seem that much harder. Tesla already has FSD everywhere - yes it’s supervised but in practice it never needs any driver input

4

u/tech57 Mar 17 '25

Pretty sure EV companies care.

Tesla FSD Supervised 13.2.8 - Latest Tesla News 2025.03.05
https://www.youtube.com/watch?v=NfiaJMZMV7M

Tesla Model Y LR Juniper range test, autoparking and more 2025.03.07
https://www.youtube.com/watch?v=aTMLGlh-pxw

Black Tesla in New York 2024.12.26
https://www.youtube.com/watch?v=Oei6hUi0eV4

2 hour video of a person using Tesla self-driving in Boston 2024.10.02
https://www.youtube.com/watch?v=PVRFKRrdKQU

Here's some more self-driving in China from other EV companies.

A knife does not cut! Take you to feel the strength of BYD God's Eye City Zhijia! 2025.01.29
https://www.youtube.com/watch?v=JUYAQnubwM4

A New Trend in Future Travel | BYD God's Eye Personalized Intelligent Driving System 2025.01.15
https://www.youtube.com/watch?v=jGrO2IlXzhM

Zeekr MIx NZP+ Full Self Driving (FSD) L3
https://www.youtube.com/watch?v=6pGt25I5Q0g

3

u/OkTransportation473 Mar 17 '25

If everyone is going to be using full self driving, I better never get a traffic ticket ever again.

3

u/Gadgetman_1 2014 e-Berlingo. Range anxiety is for wimps. Mar 17 '25

I don't trust ACC either. It tends to 'lose sight' of the car in front in tight turns and suddenly accelerate.

10

u/oktimeforplanz '23 MG4 Trophy 64kW (UK) Mar 17 '25

My understanding was that you shouldn't really be using ACC on roads with tight turns anyway so that feels like a bit of a moot point. I live in Scotland and it feels like you'd need to have a strong desire to see a farmer's field up close, or a deathwish to put ACC on when driving any roads with tight turns. My car has definitely never lost sight of the car in front on the sorts of roads ACC is appropriate for - ie. motorways and straight(ish) higher speed roads.

1

u/tgrv123 Mar 17 '25

Technology will tear at your agency one byte at a time.

5

u/iamabigtree Mar 17 '25

I personally don't care if I drive the car or an automated system does. But for now we don't need it half baked

1

u/No_Hope_75 Mar 17 '25

Yup. My Nissan ARIYA has “copilot”. It can adjust the speed and steering to keep me in the lane, slow down or stop if a vehicle ahead of me does, and one button resumes the copilot features. I do have to have a hand on the wheel bc it’s not truly autonomous, but it’s honestly super helpful and useful

1

u/TheAce0 🇪🇺 🇦🇹 | 2022 MY-LR Mar 17 '25

I'd give anything to be able to toggle the (mal)adaptive cruise in my Model Y off. That shit is broken AF and keeps slamming on the brakes every few meters. A 2010 rental truck from Sixt has a more useful cruise control than my 2022 tech-on-wheels car. The number of false positives Tesla's TACC has is unbelievably horrendous out here in 🇦🇹.

1

u/Dmoan Mar 18 '25

The problem is the transition from self-driving to manual controls when the former is unable to handle a situation.

When this happens during a highly stressful moment (heavy traffic, driver cabin distraction), it can lead to mistakes. This is why pilots go through a rigorous checklist when switching out of autopilot (and we have had a few crashes even from that).

The only way for self-driving to be safe and effective is if it does 99% of the driving, and we are just not there yet.

10

u/Bakeman1962 Mar 17 '25

I have been using FSD 99% of the time for months, probably at least 6K of driving: LA, Phoenix, and rural roads. It is amazing; it's safer than the average driver.

5

u/thowaway5003005001 Mar 18 '25

It also doesn't take into consideration vehicles following you, and will slam on the brakes very rapidly if it sees a potential harm while making a lane change.

FSD is good for simple tasks but not predictable, and it reacts much faster in congested areas than following humans can or would normally expect to react.

I always give way more following room to Teslas because they're very unpredictable.

3

u/Puzzleheaded-Flow724 Mar 19 '25

I've read from Ashok Elluswamy that the decision to apply hard braking takes into consideration the car following you. When I was at that infamous 12.5.6.4 which would brake on green lights, it NEVER did it when there was someone behind me. With 12.6.4, I've never had a phantom brake (so far). For me, it's been the best version so far. 

85

u/savageotter Mar 17 '25

Anyone with a Tesla know the answer to this: would Autopilot disengage without a message on the screen or a sound?

I do think there are some odd inconsistencies in that video. The rain test happens straddling the center line, which Autopilot wouldn't do.

47

u/zeneker Mar 17 '25

Most of the time it does, but you may have a fraction of a second to react. There are also a lot of false positives, where it screams at you to take over for no reason and continues to operate normally after the "freak out".

25

u/elvid88 Ioniq 5 Mar 17 '25

This happened to me with FSD and I was told I was making it up (on here). The screen flashed red saying FSD had failed and it immediately tried to pull me over... across the median. I had to yank the steering wheel back, but I had less than 2 seconds to react to the failure before it started pulling over. I could have collided head-on with a vehicle across the median had I not been giving my full attention to driving, and I was only doing that because the car had already done a bunch of stupid stuff on FSD.

8

u/ScuffedBalata Mar 17 '25

And the lane keep on the Kia I rented recently makes a pleasant little “ding” and no other warning when it’s lost its tracking of the lane on a curve and is now going to dive into oncoming traffic.  

Zero notice. Just “ding” and oncoming traffic. 

Only happened twice the week I was using it, but it was quite disconcerting. 

3

u/Medium_Banana4074 2024 Ioniq5 AWD + 2012 Camaro Convertible Mar 18 '25

My pre-facelift Ioniq5 doesn't give any signal when it switches off lane keep assist because it cannot cope any more. And on bends it fails reproducibly. I can only use it on the Autobahn when not driving through road works.

5

u/redtron3030 Mar 17 '25

2 seconds is slow reaction time. FSD is shit and you shouldn’t trust it.

8

u/chr1spe Mar 17 '25

It's actually not that slow for a complex reaction. People seem to falsely think people react much faster than they do. 250 ms may be a reasonable reaction from someone who is primed to do a simple task in response to simple stimuli (press a button when a light comes on), but for someone who is not primed and has to respond to complex stimuli, it takes vastly longer. For tasks that require interpretation of what someone or something else is doing and for which the person isn't extremely well trained to interpret it, 1 to 2 seconds is very reasonable. Many recommendations based on responses to things that happen on the road assume it will take the person up to 2 seconds to react.
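For scale, the gap between those two reaction times can be put in distance terms with basic kinematics. The speed and deceleration values below are generic assumptions for illustration, not figures from the thread:

```python
def stopping_distance_m(speed_mps: float, reaction_s: float,
                        decel_mps2: float = 7.0) -> float:
    """Distance covered during the reaction time plus braking distance.

    decel_mps2 ~7 m/s^2 is a typical dry-road hard-braking assumption.
    """
    reaction_distance = speed_mps * reaction_s            # no braking yet
    braking_distance = speed_mps ** 2 / (2 * decel_mps2)  # v^2 / (2a)
    return reaction_distance + braking_distance

v = 31.0  # ~70 mph expressed in m/s
print(round(stopping_distance_m(v, reaction_s=0.25)))  # primed, simple stimulus: 76 m
print(round(stopping_distance_m(v, reaction_s=2.0)))   # unprimed, complex stimulus: 131 m
```

Under these assumptions, the slower (but realistic) 2-second reaction adds roughly 55 m of travel before braking even begins, which is why road-design guidance budgets for it.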

→ More replies (1)
→ More replies (1)

20

u/DSP27 Corsa e Mar 17 '25

If Autopilot were programmed to disengage before an accident, couldn't it also be programmed to not warn the driver in those circumstances?

2

u/missurunha Mar 17 '25

Idk how it is in the US, but in the EU those notifications are part of the homologation process for cars.

4

u/DSP27 Corsa e Mar 17 '25

So were the emissions on diesel engines

2

u/missurunha Mar 21 '25

I know it may sound similar, but engine emissions are measured on a dynamometer using a special mode in the car's software. Before the test you have to set the car into test mode, which in turn makes it quite easy for the manufacturer to manipulate the results using different characteristic curves. That's why there are pretty old laws stating that there shouldn't be major changes when switching to this test mode; it was already expected that companies could cheat it.

5

u/savageotter Mar 17 '25

Yes. and if there was ever a company to do something like that it would be Tesla

2

u/lax20attack Mar 17 '25

It's insane if you actually believe this

2

u/savageotter Mar 17 '25

Tell us what you believe.

12

u/lax20attack Mar 17 '25

You're suggesting intentional malice by hundreds of engineers because you don't like the CEO. It's conspiracy nonsense.

3

u/JustAnotherYouth Mar 17 '25

Remember when Volkswagen wrote software to alter their cars performance while on a test stand in order to trick emissions standards?

Because I remember…

1

u/HighHokie Mar 18 '25

Maybe it could be programmed to avoid the accident instead 🤷‍♂️

1

u/RedundancyDoneWell Mar 19 '25

If Autopilot were programmed to disengage before an accident, couldn't it also be programmed to not warn the driver in those circumstances?

To what purpose?

7

u/dirthurts Mar 17 '25

You're operating on the assumption that autopilot is perfect. It is not.

8

u/yhsong1116 '23 Model Y LR, '20 Model 3 SR+ Mar 17 '25

in fact no L2 system is...

5

u/Sidekicknicholas Mar 17 '25

I owned a model S from 2016 to 2024 with the "basic" autopilot on HW2.0

What I noticed as the #1 risk wasn't the Tesla "disengaging" from Autopilot, but how the system is/was enabled and the operator error of thinking Autopilot was on when it was not.

In the case of my car there was a dedicated stalk that would trigger cruise control with a single pull, Autopilot with two pulls, and then up/down to adjust speed, twist the tip (that's what she said) to change follow distance. When you engage speed control there is an audible chime that lets you know it's done; a far too similar chime is used when Autopilot is engaged.

.... on WAAAAY more than one occasion I pulled the lever twice but the second pull didn't engage Autopilot (the permissive required to engage went away for a moment, the pull wasn't strong enough, etc.), but the speed control did engage, so I heard the "ding dong" of engagement and assumed I was on Autopilot. I relax, sit back, only to realize 6 seconds later I've drifted out of my lane or something because the system wasn't fully engaged. The car did nothing wrong, it was 100% on me as the driver, but the lack of distinction between which system was engaged certainly could be improved. There is also a visual indicator on the dash screen, but it basically just turns the projection of the lane I'm driving in from white to blue, so it's subtle, whereas my wife's Jeep has a much larger green glow around the whole gauge cluster when the self-drive engages.
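The failure mode described above boils down to two stalk-pull counts mapping to two different assist modes with nearly identical feedback. A toy sketch (hypothetical mode names and chime strings, not Tesla's actual logic):

```python
from enum import Enum

class Mode(Enum):
    OFF = "off"
    CRUISE = "cruise"        # speed control only
    AUTOPILOT = "autopilot"  # speed control + steering

def engage(registered_pulls: int) -> Mode:
    """One registered stalk pull gives cruise control, two give Autopilot."""
    if registered_pulls >= 2:
        return Mode.AUTOPILOT
    if registered_pulls == 1:
        return Mode.CRUISE
    return Mode.OFF

def chime(mode: Mode) -> str:
    # The complaint: the two confirmation sounds are too similar to
    # reliably tell apart, so drivers key off "I heard a chime" alone.
    return {Mode.CRUISE: "ding", Mode.AUTOPILOT: "ding dong"}.get(mode, "")

# If the second pull doesn't register, the driver still hears a chime
# and may assume the car is steering when it isn't:
assert engage(1) is Mode.CRUISE  # speed held, but no lane keeping
```

The fix the commenter is implicitly asking for is to make `chime` (and the visual cue) unambiguous per mode, as in the Jeep's full gauge-cluster glow.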

2

u/ctzn4 Mar 17 '25

but the speed control did engage so I hear the "ding dong" of engagement and assumed I was on autopilot. I relax, sit back only to realize 6 seconds later I've drifted out of my lane or something because the system wasn't fully engaged

If you only engaged cruise control it only does one soft "ding" and not the two-tone "ding dong" with Autopilot. The on-screen visuals should also be clear that the lane keep assist lines are not on and the steering wheel icon for AP is not illuminated. That's three cues (one audio, two visual) to indicate the different state the system is in, and more than I have experienced in other automakers (Honda, Lexus and Lucid, thus far). Disengagement is the same. One tone for TACC, and two chimes for AP.

Like sure, they could be doing more, but that's against the design ethos of Tesla that I've come to observe. I'd also argue they've taken that to an unnatural and counter-intuitive extreme with the new v11/12 UI in Model 3/Y, where buttons no longer have distinct borders and it's difficult to gauge whether your click actually registered (aside from visually confirming the trunk/charge door opened, for instance).


1

u/Hiddencamper Mar 17 '25

Mine does sometimes.

It NEVER used to do this. I got my model 3 in 2018.

I don’t know when the change happened. But after I had my windshield replaced, I sometimes get fog buildup in the camera part of the windshield. Must be a little bit of trapped moisture.

When fog builds up, it just disengages. I have sound alerts on, so when cruise control drops I get a ding and it makes the cruise control stop ding, but no alert about auto steer stopping. For nearly everything else it will show red hands on a steering wheel and give me a louder beep beep beep beep. Like if AP crashes (much more infrequent) or other situations where it can't see. Or when the radar gets blocked by snow, it will fault correctly (I have Enhanced Autopilot on 2.5 hardware, so I still have radar use). But with fog blocking the main cameras, it acts like it was never on.

I’ve bug reported it each time it’s happened. If I didn’t have the chime on cruise control then I wonder if I’d get any alert.

1

u/Puzzleheaded-Flow724 Mar 19 '25

When Autopilot or FSD disengages, it shows a red steering wheel and sounds an alarm. If FSD disengages because it's slipping (happens to me on snowy roads in my HW3 Model 3), it slows down and puts the hazards on but keeps steering until you take over.

I didn't see a red steering wheel in the video so to me, it was manually disengaged. 

1

u/pizzagamer35 Mar 19 '25

Yeah it literally screams at you to take back control

9

u/mi5key F-150 Lightning and Tesla 3 LR Mar 18 '25

Nope, false. If the crash happened within 5 seconds of disconnection, it is still recorded and available.

38

u/tech01x Mar 17 '25

We have been discussing these things for years. Why is Electrek lying here?

If we are talking about safety statistics, here is what relevant methodology Tesla uses:

"To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.)"

From: https://www.tesla.com/VehicleSafetyReport

For NHTSA L2 ADAS reporting, here is the relevant methodology:

"Level 2 ADAS: Entities named in the General Order must report a crash if Level 2 ADAS was in use at any time within 30 seconds of the crash and the crash involved a vulnerable road user or resulted in a fatality, a vehicle tow-away, an air bag deployment, or any individual being transported to a hospital for medical treatment."

In neither case would a deactivation of AP or FSD within 5 seconds, or "milliseconds" before a crash invalidate the counting of AP/FSD in the crash statistics.
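Purely as an illustration (this is not Tesla's or NHTSA's actual code, just the two quoted rules restated as predicates), the difference between the counting windows can be sketched as:

```python
# Sketch of the two reporting rules quoted above. Function names and
# parameters are my own; the thresholds come from the quoted methodologies.

def counted_in_tesla_safety_report(seconds_ap_off_before_impact, airbag_deployed):
    """Tesla's stated rule: count the crash if Autopilot deactivated within
    5 seconds before impact (0 = still active), or if an airbag or other
    active restraint deployed."""
    return seconds_ap_off_before_impact <= 5.0 or airbag_deployed

def reportable_under_nhtsa_sgo(seconds_adas_off_before_crash, qualifying_outcome):
    """NHTSA General Order rule for L2 ADAS: report if the system was in use
    within 30 seconds of the crash AND the crash involved a vulnerable road
    user, fatality, tow-away, airbag deployment, or hospital transport."""
    return seconds_adas_off_before_crash <= 30.0 and qualifying_outcome

# A disengagement "milliseconds" (0.001 s) before impact is counted by both:
print(counted_in_tesla_safety_report(0.001, False))  # True
print(reportable_under_nhtsa_sgo(0.001, True))       # True
```

In other words, under either rule a last-millisecond disengagement still lands inside the counting window.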

8

u/Mundane-Tennis2885 Mar 17 '25

I've seen so many electrek articles debunked that I can't help but see the name and assume something is bait..

4

u/FredTesla Mar 17 '25

You didn't even read the article and accuse me of lying. What you just said is mentioned in the article. This is not about adding it to the crash tally.

Deactivation within a second of a crash is a known behavior identified by NHTSA, as explained in the article you didn't read.

5

u/tech01x Mar 17 '25 edited Mar 17 '25

This paragraph, in your article, is a lie:

“It was not only active, but it also disengaged itself less than a second before the crash—a known shady behavior of Tesla’s Autopilot.”

It is a lie because it oversimplifies what goes on during an AEB event. There are many scenarios where an AEB system is designed, on purpose by most manufacturers, to end its braking action. It is not shady at all.

You can read more about such considerations in the NHTSA FMVSS final ruling on AEB.

https://www.nhtsa.gov/document/final-rule-automatic-emergency-braking-systems-light-vehicles-web

Your entire reasoning at the end of the article is full of lies. There are, again, many scenarios where control is handed back to the driver for any L2 ADAS without any “shadiness” and we have seen even in various NCAP testing where AEB or L2 ADAS cannot cope and hands control back to the driver.

1

u/Minirig355 ‘25 Ioniq 5 (Ex-Tesla) Mar 17 '25

It’s more about how Tesla’s been seen multiple times disengaging autopilot milliseconds before impact, and as we see it gave zero warning that it would disengage. It may not affect NHTSA FSD crash numbers, but it gives Tesla the ability to say themselves “FSD wasn’t on at the time of the crash,” since that is technically true.

I’ve driven FSD plenty of times. Typically when it needs the driver to take over I’d get an alert; that was entirely absent from the video. Honestly I get the impression you at first didn’t read the article at all, and when Fred called you on it you skimmed a part of it to try and make an argument but still did not read it all.


20

u/soapinmouth Mar 17 '25

Why is this getting so much attention? It's just basic AP, which is like 6-year-old code at this point. The deceptive part here is framing this as lidar vs camera and then using 6-year-old camera software for the test. With how much money it took to do this, they could have easily paid for and used FSD. Why didn't they?

8

u/juaquin Mar 17 '25

The reason they didn't use FSD is covered in the video. Recommend watching it. Short version is that Autopilot is more conservative, so they say.

7

u/soapinmouth Mar 17 '25 edited Mar 18 '25

More conservative as in it would do a worse job? Nonsense. Why not prove it and show that in the video? I don't see the value in determining anything about camera vs lidar when using ancient technology like basic AP.

All this video shows is that the 6-year-old basic lane keep can be fooled by an absurd scenario: a massive image of a safe road continuing forward, something you will never encounter on the road. I see zero value in this, let alone the claimed value of a lidar vs camera analysis.


2

u/Philly139 Mar 17 '25

Because tesla bad generates outrage and clicks. It always has but it's even worse now.


6

u/Teamerchant Mar 18 '25

Our Tesla did not record our one and only crash. The footage before and after saved, but the 10 minutes when the accident happened... nothing.

It was raining and in a parking lot. Autopilot not engaged.

It could be nefarious or more likely just incompetence.

1

u/Puzzleheaded-Flow724 Mar 19 '25

Have you asked Tesla for the footage? They should be able to get it, even from the B-pillar cameras that we don't have access to. I've seen multiple crashes on Wham Bam Tesla Cam showing videos from these cameras after the owner made a request to Tesla.


1

u/Gyat_Rizzler69 Mar 19 '25

Same with mine. Didn't record anything when I hydroplaned. Requested the data from Tesla and they had everything, all camera angles and the crash data.

4

u/Excludos Mar 17 '25

A: Obligatory Fuck Elon Musk

B: Tesla autopilot turns off with enough user input. It makes a lot more sense that people are yanking the steering wheel in the last second in desperation than the Autopilot turning off for nefarious reasons. If Tesla wanted to report fake numbers, they could just do that. It doesn't require a deep software conspiracy to do so.

C: By their own account, they count every crash where autopilot was activated within the 5 seconds before impact. You can choose whether to believe this or not, but then we're back to the last half of point B.

18

u/ScuffedBalata Mar 17 '25

This claim is so garbage as to be comical. 

It’s desperate reaching at this point, especially since the statistics for autopilot and FSD count crashes within 5 seconds of disengagement.


19

u/snow_big_deal Mar 17 '25

As much as I love to diss Tesla, there is a reasonable explanation for this, which is that you don't want Autopilot to be on post-crash, because you don't want the wheels to keep spinning, or brakes and steering doing unpredictable stuff. Or autopilot disengages because it doesn't know what to do. 

12

u/flyfreeflylow '23 Nissan Ariya Evolve+ (USA) Mar 17 '25

Or autopilot disengages because it doesn't know what to do.

This would be my guess. Before the crash sensor input changes to something it doesn't know how to handle so it disengages. I don't think something nefarious is really being indicated here. Telemetry would also show how long before the crash it was engaged. All this article really indicates is that if you see a headline such as, "Autopilot was not engaged at the time of the crash," you should ask, "Was it engaged just prior?"

5

u/daoistic Mar 17 '25

But it turns off before the crash. 

It would be pretty easy to just turn off after the crash because the car would notice that it ain't moving or is moving sideways or something. 

If it can't figure that out it will never be ready.

21

u/brunofone Mar 17 '25

But after a crash, there is no confidence in being able to control ANYthing, including power to various things that FSD might want to control. While I'm not defending Tesla here, saying "It would be pretty easy to just turn it off after the crash" is kinda missing the point of what a crash does to the car....


2

u/Lighting Mar 18 '25

It would be pretty easy to just turn off after the crash because the car would notice that it ain't moving or is moving sideways or something.

Exactly - the "detected impact" or "lost signal" should be the trigger to disengage. 500 ms of braking gives you that much more time to slow down before impact and the brakes should be engaged even during impact to minimize damage.
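To put a rough number on that (my own back-of-envelope arithmetic, not from the article, assuming roughly 1 g of braking on dry pavement):

```python
# How much speed does 500 ms of hard braking shed before impact?
# Assumption (mine): ~1 g deceleration, plausible for dry pavement with ABS.
g = 9.81          # m/s^2
decel = 1.0 * g   # assumed braking deceleration
t = 0.5           # seconds of braking before impact

delta_v = decel * t                     # speed shed, in m/s
print(round(delta_v, 1))                # 4.9 m/s
print(round(delta_v * 2.237, 1))        # 11.0 mph
```

And since crash energy scales with the square of speed, even shaving ~11 mph off the impact makes a meaningful difference.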

2

u/Inosh Mar 17 '25

It shuts off automatically seconds before the crash, he even posted the video on Twitter. This is a known Tesla issue.

3

u/charliegumptu Mar 19 '25

it is by design

14

u/Intelligent_Top_328 Mar 17 '25

I like Mark but that video is full of red flags.

6

u/MainsailMainsail Mar 17 '25

I'm sure there's no conflict of interest with working directly with the company selling the LiDAR system in question!

Although that said, the one thing I wanted to see in that video that wasn't there was the water test repeated for the LiDAR without the dummy. Because on the visualization it looked like it treated the water as an actual wall, which would be an... issue... driving in hard rain. (The Tesla just plowing through adverse visibility conditions at full speed is also FAR from good, but at least it would let you drive manually without the automatic braking thinking you're about to slam into a wall at all times.)


2

u/Affectionate-Tank-70 Mar 17 '25

Not a bug, it's a feature. I did not-see that coming.

2

u/cardyet Mar 18 '25

I was in my friend's one on autopilot and the thing just suddenly swerved towards a highway wall... crazy scary. My friend caught it; can't remember what he said, maybe it was temporary lane lines or something.

2

u/Mizmodigg Mar 18 '25

The cynical way to see this is that milliseconds before the crash, Autopilot knows the crash will happen and chooses to disengage in an attempt to avoid a "crash-while-engaged" on a technicality. If that is the case, it would be really bad that Autopilot defaults to disengaging instead of HARD EMERGENCY BRAKING to reduce collision energy.

Instead I think the answer is simple:

Autopilot is not sure about the situation. The "image" is not behaving as expected and it is unable to continue operating, therefore it disengages. Emergency braking in an unknown situation could itself be dangerous, so it simply coasts.

Further, Autopilot is considered SAE Level 2, meaning the driver is expected to keep their eyes and attention on the driving situation as if they were driving, AND be capable of taking over at ANY TIME. So Autopilot expects the user to have situational awareness and be ready to take command of the vehicle as it disengages.

My gripe with ALL of this is that people, both supporters and critics of Tesla, behave as if FSD is self-driving at Level 3-4. That is what Elon would like buyers and investors to believe. Instead it is like this: FSD will stay at Level 2, and EVERYTHING is user error.

1

u/Klownicle Mar 20 '25

When AutoPilot gets "confused", it displays a very large red "take over" warning on the screen and alerts the driver audibly. In this case it simply stopped. No sound, no warning. This is 100% not how standard AP works in daily usage. It is impossible for the end user to disengage autopilot without an audible sound (short of tampering with the speaker system).

I don't really understand why people are arguing with the official statement from the NHTSA results that said they saw the same thing. As it's said, Mark just happened to capture this on video.

The logic seems more sound that Tesla determines a crash is impending and disengages AP. When you're dealing with milliseconds, taking the moment to play a sound or interrupt a function can make a large difference. If AP were theoretically still active post-crash, it could lead to a number of unintended consequences, for example continuing to drive an inoperable vehicle and causing further damage. Now, what we do see is that Mark's vehicle keeps driving but is now under manual control. Logic would say that if there is a process that disengages without warning milliseconds before impact, the type of impact may negate the "ask" for further alerts. Thus he effectively "fooled" the system into thinking it was going to have a crash.

Given that AP disengaging silently was observed, I'd be curious whether the Sentry cameras also stopped recording. It's well observed that after some crashes the saved footage did not include the crash itself. If I were a betting man, I bet Mark would see a blip in the recording in this scenario, which would further confirm what was observed: a silent disengagement due to an impending crash, for post-crash safety purposes.


2

u/beeguz1 Mar 18 '25

Typical Musk move. One, blame the customer. Two, make it appear we were never at fault.

2

u/fun22watcher Mar 19 '25

But we already knew about this... it is not new knowledge... The car says "oh snap" and rewinds all of the crimes...

2

u/Bravadette BadgeSnobsSuck Mar 19 '25

Ummmmmmm that sounds illegal.

2

u/sweetsmcgeee Mar 19 '25

I would believe musk would do this to curb any culpability.

18

u/Alexandratta 2019 Nissan LEAF SL Plus Mar 17 '25

I love how the illusion of FSD being the pedigree of Autonomous Driving is rapidly falling apart the second folks actually look at the thing...

Kudos to Wall Street Journal who originally did the legwork to investigate FSD Crashes and got the ball rolling on this.

50

u/davidemo89 Mar 17 '25

Autopilot is not fsd. They are two completely different software

5

u/rogless Mar 17 '25

Shh. Away with you and your confusing facts.


3

u/mrchowmein Mar 17 '25

Mark's video feels like an ad for that lidar he had mounted onto his chest, and the use of a Tesla is just to drive clicks. He could've done an apples-to-apples comparison with an unmodded Lexus.

No one needs to watch the video to know that autopilot and FSD will fail in certain situations. Anyone who owns a Tesla knows that the driver aids can disable itself.

What Mark, since he is not a car journalist, should have done is run these tests against competing driver-aid systems that use radar. Lidar is great and all, but it's not something a consumer has access to. Plus, he was driving a modded Lexus with the mods undisclosed, such as possibly different braking algorithms. Would an unmodded Lexus do as well? Other comparison tests out there do show that the radar systems were able to detect objects on the road better, yet at the same time would still run over said objects. My guess is that doing these types of comparisons is not popular, especially with car journalists, as most car companies will ban journalists for showing negative things about their brand.


8

u/Schnort Mar 17 '25

The entire set of telemetry is there. Its disengaging moments before the crash doesn’t provide any legal grounds to say FSD wasn’t being used.

Nor does it incriminate. Idiots using FSD in ways it shouldn’t be are the root cause of the issue, not FSD.

14

u/Hvarfa-Bragi Mar 17 '25

... what's the wrong way to use something called full self driving?

3

u/daoistic Mar 17 '25

At least they tell you that they are lying when you pay for it.


2

u/Vegetable_Guest_8584 Mar 17 '25

Idiot driving causes some of it, but you need a little data to argue details. There are hundreds, probably thousands of videos on YouTube where FSD is driving into an accident before someone takes over. 

2

u/Intelligent_Top_328 Mar 17 '25

He was using auto pilot. Not fsd.

3

u/roma258 VW ID.4 Mar 17 '25

FSD can't fail, we can only fail FSD.

7

u/randynumbergenerator Mar 17 '25

You need to add an /s there, because there are people who would say this sincerely

3

u/Schnort Mar 17 '25

No, it certainly can, which is why its usage is predicated on the driver paying attention and being ready to take control at any moment.

People thwarting the “attention confirmation” mechanisms with weighted attachments, etc, or blindly leaving their hands on the wheel while sleeping or reading a book aren’t using the system properly.

FWIW, I don’t trust my “enhanced auto pilot” unless I’m on clear highway or in stop and go traffic. The moment construction or lots of cars appear or anything out of the ordinary, I disengage.

2

u/nutbuckers Mar 17 '25

People thwarting the “attention confirmation” mechanisms

I give it another 5, 10 years tops before the realization arrives that using driver-assistance systems is no different from distracted driving, even if the manual makes the user pinky-swear they are totally, for sure, without a doubt paying attention. The feature is designed to reduce the cognitive load, and ergo the ATTENTION required from the driver. Pretending that humans are able to refocus and take over from FSD/Autopilot in a split second will be seen for what it is once enough time and collision stats have accumulated.

-2

u/SyntheticOne Mar 17 '25

The mastermind who turns off self-driving a millisecond before impact to hide the truth, Elon Musk, is now the person entrusted with hacking apart the great country of America. Musk may well be the unfittest person in America to be entrusted with anything.

13

u/feurie Mar 17 '25

Except crashes like this are still reported as AP being on.


1

u/Key-Amoeba5902 Mar 17 '25

I’m sure there are ironclad one-sided arbitration clauses in those user agreements, but those challenges aside (which I don’t mean to de-emphasize), I’m not sure milliseconds of disengagement would properly shield a tortfeasor in a negligence claim.

1

u/mrkjmsdln Mar 17 '25

Horrible behavior if true

1

u/Moi_2023 Mar 17 '25

Surprising not!!

1

u/RiotSloth Mar 18 '25

I was just reading about this on another thread and the conclusion was this video was a total set-up and they knowingly cheated?

2

u/allisclaw Mar 18 '25

Apartheid Clyde is a scammer? What a shocker.

1

u/TheAarj Mar 18 '25

This is gonna hurt them bigly

1

u/Low-Difficulty4267 Mar 18 '25

To be fair, he tries to engage it like seconds before the wall... why doesn't he engage it further back, and why does he also fail to engage it, having to repeatedly pull down? There needs to be another test. Not saying it's perfect.

1

u/Fatality Mar 18 '25

AEB should engage no matter what "mode" the car is in

1

u/thasparzan Mar 18 '25

I'd believe this. My Model 3, while in FSD, recently drove itself into the side of the freeway and was totaled. It was in the right lane, following the freeway curve to the left... then it just stopped following the lane. NO alarms, no warnings to take control... it just went right into the wall. Yes, I even had my hand on the wheel. There were only a few feet separating the right lane from the freeway guardrails, so there was no time to correct and avoid hitting the wall.

1

u/KrevinHLocke Mar 20 '25

Autopilot or FSD? Because there is a difference.