Tesla’s Autopilot involved in far more crashes than previously known (seattletimes.com)
79 points by abawany on June 11, 2023 | 41 comments


The real number may be far higher.

Somebody bothered to make a website to track Tesla deaths and Tesla Autopilot-related deaths.

https://www.tesladeaths.com/

The current count is 33 Autopilot-related deaths.

Last time I cross-referenced the incidents it was accurate, but that was a while ago.


It is very important to note that the website has a clear bias against Tesla by its very nature: it's trying to paint Teslas as death machines. I've never heard of a page like this for any other manufacturer.

Case in point: Case 312.1 is a pickup truck that ran a red light at an intersection and hit the Tesla, killing its driver, according to the archived news report. The stats on that page show it as a death in the "TSLA+cycl / peds" column, even though no vulnerable road users were involved.

Any reasonable individual would see that the Tesla bears zero fault here, but since the crash involves a Tesla and a death, it's included.

Being involved in an accident does not mean the Tesla is at fault.


The website states very clearly, in bold, at the very top:

> Tesla Deaths is a record of Tesla accidents that involved a driver, occupant, cyclist, motorcyclist, or pedestrian death, whether or not the Tesla or its driver were at fault

The FAQ contains the reasoning behind that, which is very simple:

> Q: Why do you include deaths where the Tesla was not at fault?

> NHTSA death statistics cited by Elon Musk do not differentiate between at-fault and not-at-fault. All manufacturers are treated the same way. For the purposes of calculating Tesla fatalities in accordance with NHTSA methodology, we also do not include the many deaths in other vehicles, despite Teslas being at fault in the majority of those cases.

Note that since this is only based on press reports, it is a lower bound on the number of deaths, probably by a significant margin.


For reference, there are over forty thousand automobile-related deaths in the US every year.


It would be interesting to see what the rate of car accidents is per car make and model.


What exactly are the main claims being made here? On a skim I don't think I saw anything to support the headline. I did see that they were suggesting the rate of crashes was rising quickly in the past year, which seems believable.


I'm wondering about the school bus story here, as a non-USA person. Does the school bus display flashing warning lights to indicate that a child is crossing in front of the bus while it is stationary? Seems like a dangerous situation (that the driver should have slowed for). Is crossing in front of the bus really safer than making the child cross after the bus has departed?


Typically the bus has a stop sign with flashing lights that automatically swings out when the bus driver opens the doors. The law in most places in North America is that it is illegal to pass the school bus when students are embarking or disembarking, so in practice it is safer for the students to cross the street while the school bus is there.

Image of a typical North American school bus: https://en.m.wikipedia.org/wiki/School_bus_traffic_stop_laws...


Where I am, when a school bus makes a stop to pick up or drop off kids, it does a lot of signaling. Big red lights at the top blink, and a stop sign with lights flips out from the side. It is against the law to pass a school bus from either direction when its lights are on in this configuration. I've heard of districts equipping buses with cameras where there have been incidents of illegal passing like this.

Also, back in the day there was a little barrier, hinged on the passenger side, that would flip out to keep kids from walking so close in front of the bus that the driver couldn't reasonably see them. I don't know if those are still common or not.


They cite an incident there where the driver didn't see the motorcyclist... if that's the case, it's hard to see how this one would be on Tesla.


Tesla claims it has 8 cameras, all around the vehicle, precisely so that it can handle cases like that.

The fact that the car, despite having 8 cameras, all around the vehicle, still failed to notice the motorcyclist is a damning indictment of the poor quality of Tesla's vision software.

My Subaru has absolutely no problem detecting cars, cyclists, or pedestrians around or behind my vehicle, and Subaru doesn't even have a best-in-class vision system.


And Cruise, with tens of cameras, lidars, radars, and ultrasonic sensors, rear-ended a bus [1].

I'm not trying to bash Cruise. My point is that shit happens; we should look at the complete statistics, not one-offs.

[1] https://www.kron4.com/news/bay-area/cruise-car-appears-to-re...


"damning indictment"

Okay. Who out there has better vision software that can do as much as Tesla's?


How does this compare to Tesla’s FSD? Autopilot is just lane-keep assist plus adaptive cruise control (cruise control that keeps your distance from the car in front of you), and most new cars these days have something comparable. FSD is usually what we are more likely to debate on HN.


I think it's important to distinguish Tesla's autopilot from other manufacturers' because, to my knowledge, Tesla is the only manufacturer attempting to build this kind of advanced driver assistance exclusively using camera technology and computer vision. All other manufacturers use various sensors such as radar, lidar and sonar as well as cameras.


> Tesla is the only manufacturer attempting to build this kind of advanced driver assistance exclusively using camera technology and computer vision

Tesla is also far less cautious in its marketing. Most reasonable complaints about Tesla's self-driving kit aren't about the kit per se, but the false expectations that encourage drivers to confidently send it where it shouldn't go.


I’m pretty sure no one is using LIDAR for lane-keep assist and distance-adjusted cruise control. I’m not sure if my BMW i4 uses radar for distance-adjusted cruise control; it definitely doesn’t use it for lane keeping.

But I wonder how often the equivalent features fail in other comparable systems, since they are becoming very common now; surely some of them are failing. My system warns me each time not to rely on it to avoid crashes, for example. I get that Tesla gets most of the attention, especially since they also offer FSD, but surely there are failures from other car manufacturers that simply don't get the same microscopic attention.


MobilEye (which Tesla actually used to license for 'AP1') exclusively uses camera technology, even with a single camera in some (most?) cases. It handles TACC, lane keeping, and AEB. Extremely similar to the Autopilot offerings (not FSD, though).


> Authorities said Yee had fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands

Wow, just wow. People, don't do stupid stuff like this with safety features.


I would never do this, but Autopilot nags the shit out of you when you’re driving. You have to apply significant turning force for it to register, and then it will nag again very quickly after that. It feels so counterintuitive to “steer away” when it’s actually doing the right thing.

I can 100% understand that people would want a workaround. It almost makes it not worth it. Some sort of capacitive detection of hands would make it a much better experience.


>The school bus was displaying its stop sign and flashing red warning lights, a police report said, when Tillman Mitchell, 17, stepped off one afternoon in March. Then a Tesla Model Y approached on North Carolina Highway 561.

>The car – allegedly in Autopilot mode – never slowed down.

>It struck Mitchell at 45 mph. The teenager was thrown into the windshield, flew into the air and landed face down in the road, according to his great-aunt, Dorothy Lynch. Mitchell’s father heard the crash and rushed from his porch to find his son lying in the middle of the road.

>“If it had been a smaller child,” Lynch said, “the child would be dead.”

Elon should really stop the yearly Level 5 promises. Also, that driver should be charged with negligence.


"Also, that driver should be charged with negligence."

It's surprising this isn't the first (and maybe only) salient point. Autopilot is a slightly more capable cruise control. If a driver sets a car to drive 45 mph and then stops paying attention, who's at fault when the car hits something? The car was doing exactly what it was supposed to do.

(As pointed out by another commenter, the article is about Autopilot, not FSD.)


> Elon should really stop the yearly Level 5 promises. Also, that driver should be charged with negligence.

AutoPilot has little to do with FSD though. I’m assuming these are accidents related to autopilot rather than FSD, or the article would have been titled differently?


...fixed weights to the steering wheel

I have to wonder what else the driver was doing. Perhaps sleeping or playing a game on their phone. I am glad they threw the book at this person, and I say that as a Tesla fan.

I don't own a Tesla yet but I will shortly. I currently have a Subaru with EyeSight, so I am accustomed to some extensive driver assistance, especially on the highway, but you still need to pay attention.

I do a lot of business travel and have rented multiple Teslas, a Chevy Bolt (no assist), and a Polestar (limited assist). The Tesla driver assist is outstanding but is still only assistance. It is not fully driving the car for you, and you need to recognize more complex situations and pay close attention to them.


Are these numbers normalized for the number of cars and car-hours?


No, they are not. https://archive.is/dTInm The article also leads with an anecdote but buries critical details about that specific accident (the driver had "rigged weights" to keep Autopilot engaged even when his hands were off the wheel).

I am not aware of any truly fair and impartial comparison of modern Teslas with other modern self-driving (whatever level) car technologies that has shown Teslas to be truly prone to more accidents. Fair and impartial would mean comparing rates rather than absolute values (and that's just a starting point), correctly determining the model and which technologies were enabled, and accounting for the driver's actions. Most newspapers are looking for clicks, "Tesla Autopilot crashes" gets clicks, and the papers don't have time to do high-quality investigative journalism.
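To make the rate-versus-count point concrete, here's a minimal sketch in Python. All of the numbers are hypothetical placeholders, not real Tesla or industry figures:

    # All figures below are hypothetical assumptions, for illustration only.
    fleets = {
        "Fleet A": {"crashes": 30,  "vehicle_miles": 3.5e9},   # small fleet, few crashes
        "Fleet B": {"crashes": 400, "vehicle_miles": 9.0e10},  # large fleet, many crashes
    }

    for name, d in fleets.items():
        rate_per_million = d["crashes"] / d["vehicle_miles"] * 1e6
        print(f"{name}: {d['crashes']} crashes, {rate_per_million:.4f} per million miles")

    # Fleet A has far fewer crashes in absolute terms but the higher per-mile rate,
    # which is why raw counts alone can't establish which fleet is safer.

A real comparison would also need the covariates mentioned above (model, which features were enabled, driver behavior), which is exactly the data that isn't public.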

(I have no opinions on the matter; I don't own a Tesla, don't know much about their technology beyond what I read in the press, and have limited real-life experience with "self-driving": basically just the standard lane keeping and radar cruise control included with my car, which I don't use often and have found to be prone to all sorts of problems.)


Years before Tesla released AP, Lexus had walled off ACC behind eye tracking.

They weren't offering half the functionality AP claims, and they still recognized the need to track driver attentiveness more robustly than by looking for steering-wheel torque.

And even after incidents like this, Tesla resisted for years before actually using their cameras for eye tracking.


Tesla does not provide government regulators or research organizations access to the raw data on Autopilot or FSD usage. Despite their claims that FSD is an autonomous vehicle system in beta testing with a safety driver, they deliberately under-classify it to the California DMV to bypass [1] the mandatory reporting requirements for autonomous vehicles under test with a safety driver [2]. As such, they have intentionally made it impossible for any untainted safety analysis to occur.

As the default assumption when dealing with safety-critical systems is that they are unsafe, and the explicit burden of proof is on the manufacturer to prove safety, we must assume that it is unsafe. In addition, since no untainted safety analysis can occur at this time, any comparison against alternatives is impossible until they stop blocking audits of their usage data. The only thing we can state for certain is that it is unsafe for any use by customers, since no untainted proof of safety has been produced.

[1] https://thelastdriverlicenseholder.com/2023/02/17/2022-disen...

[2] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...


No, they are not. However, they do mention that 40,000 people died in car crashes last year.

But the 17 involving a Tesla are where we need to pay attention. LOL


It's even worse for Tesla if you normalize the data, because the rest of the industry also has advanced driving functionality with far fewer accidents or deaths.

It turns out that one of the things that radar is good for is emergency braking regardless of lighting conditions.

It also helps that other carmakers don't oversell their advanced driving features; if anything, they deliberately understate how well the systems work to avoid giving customers the false sense of safety that Tesla does.


Yeah I love this line of argument from the Tesla cult: “But wait people use their Tesla autonomous features more frequently [in conditions the system doesn’t operate well in].”

Like… yeah, that’s called an unsafe system my dude.


Safety of systems isn't binary, so what's the risk profile?

Newer medical treatments have higher risk, just like experimental transportation methods carry greater risk.

Everyone gets sick/injured and everyone needs to get around, so some suffering in the name of advancing these endeavors seems both inevitable and tolerable. The question is to what degree?

Controlling access to, and fallout from, these automated driving systems is a temporary priority, since the roads they're testing on are far more 'public' than an individual's body undergoing a new medical treatment. But the long-term priority must be getting the systems as safe as or safer than human drivers in aggregate. That will happen, sooner or later, and I'd rather see it sooner, as long as the cost of the race isn't catastrophe. The only way to advance is to let it learn...


Yep, and when there are a dozen companies taking their obligations to the public extremely seriously, loading their cars with at least as many sensors as it takes to operate safely during the R&D phases, deploying in limited stages rather than directly into the hands of the public, and not marketing their unproven tech as “Full Self Driving,” I think it’s completely reasonable to single out the one company that’s not doing any of that.


Tesla's insistence that vision is all you need is one of the root issues.

Humans are able to drive with vision alone in part because we have a high-level conceptual model of the world that we can rely on to fill in the gaps when a straight literal interpretation of vision is wrong or inadequate. We know, for example, that there are not likely to be walls under highway overpasses and that stop signs on billboards are not real signs. This is a "strong AI"-level problem that a car isn't going to be able to solve, so the best answer instead is to give the car super-human senses so it doesn't need such a model as much.

That, and if we have self-driving cars, we want them to be safer than human drivers. That could only reasonably be achieved with super-human senses.


> even worse for Tesla if you normalize the data

Where's the data that proves this? Tesla has ~2% of the auto market, but ~0.4% of deaths.


Most higher-end vehicles (and frankly, I don't count Tesla as particularly high-end, particularly not the Model 3) have similar stats, for a multitude of reasons (including but not limited to driver experience and time behind the wheel as a function of the affordability of the vehicle).


Where's your data to back up your claim?


~2% of new cars isn't anywhere near 2% of cars on the road or miles driven, and newer cars are safer in general.


Miles driven, not market share, is the correct comparison here (and even then you'd want to include many other covariates, such as where the cars are operated, the socioeconomic status of drivers, etc.) if you really want to compare different vehicles.
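As a rough illustration of why new-car sales share is a poor proxy for exposure, here's a back-of-the-envelope sketch in Python. Every number in it is an assumption made up for the example, not an actual Tesla or fleet figure:

    # Hypothetical back-of-the-envelope figures, for illustration only.
    new_car_sales_share = 0.02   # assumed share of new-car sales
    fleet_share         = 0.005  # assumed share of all registered vehicles (most of the fleet is older)
    miles_per_car_ratio = 1.3    # assumed: newer cars are driven somewhat more per year than average

    # Rough approximation of the share of total vehicle-miles traveled.
    vmt_share = fleet_share * miles_per_car_ratio
    print(f"Approximate share of miles driven: {vmt_share:.2%}")  # ~0.65%, not 2%

    # Dividing a death share by the sales share (0.4% / 2%) versus by a miles
    # share (0.4% / ~0.65%) gives very different relative-risk estimates, before
    # even adjusting for driver demographics, road types, or vehicle age.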


I don't see how an absolute number is reportable. The only thing that's important is crashes per driving hour, which you can then compare to human drivers.


Perhaps these Tesla APs have achieved sentience. Whether these are murders or suicides, idk.



