NHTSA releases non-useful crash stats for driverless cars

The NHTSA reported statistics today on crashes of self-driving cars over the past year. The numbers are all but useless, but here they are anyway:

Tesla leads the pack among Level 2 cars, which have limited self-driving capability. Among cars with higher levels of automation, Waymo's share is by far the highest.

But in neither case do we know how many miles these cars have driven. Tesla and Waymo have almost certainly driven far more automated miles than anyone else in their respective categories, and I suspect both would have pretty low numbers if you calculated crashes/mile—or, more crudely, even crashes/vehicle.

But we don't have that unless the car companies provide that information for the relevant time period. In the meantime, this is all the NHTSA has.
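
Just to make concrete what's missing, here's the normalization we'd all like to see, as a sketch. Every number in it is invented, which is exactly the problem: NHTSA reports crash counts, not exposure.

    # Hypothetical illustration only: NHTSA reports crash counts but not
    # miles driven, so both sets of figures below are invented placeholders.
    crashes = {"Tesla": 273, "Honda": 90, "Other": 27}
    miles_driven = {"Tesla": 5.0e9, "Honda": 1.2e8, "Other": 6.0e7}

    for maker in crashes:
        rate = crashes[maker] / miles_driven[maker] * 1e6
        print(f"{maker}: {rate:.2f} crashes per million automated miles")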

31 thoughts on “NHTSA releases non-useful crash stats for driverless cars”

  1. skeptonomist

    Just the crash rate for these cars relative to non-"self-driving" cars might be of some use, although the difference might not be significant. Human drivers also fall asleep, get distracted, etc. and plow into parked cars, don't they? Ultimately we need to know whether "self-driving" cars are safer or more dangerous. You also need to know whether to rely on the feature if your car has it, but the statistics are very limited as Kevin says.

    Surely the NHTSA has crash rates for different vehicles and some "self-driving" fanatic like Kevin could track that down.

    1. Mitch Guthman

      Companies like Tesla are suspected of manipulating accident statistics, but this is difficult to prove because of the proprietary nature of the software. There is a very widespread suspicion that, for example, Teslas being driven in "autopilot" mode are programmed to return control to the driver a few seconds before an unavoidable crash, so that the accident will be attributed to "human error" rather than a gross malfunction of the car.

      If this is true, either automakers need to have one uniform program for self-driving vehicles over which they have no control or it will be forever impossible to know whether they are as safe as or more dangerous than human drivers.

  2. Mitch Guthman

    I think we can safely assume that the data which the self-driving car companies are not releasing is not helpful to their cause. Were it otherwise, I think we can be certain that the data would be released by those companies.

  3. Zephyr

    Humans can dramatically reduce their own accident rate by not drinking and driving and simply paying attention. There's lots that can be done to be a better driver, but there is nothing I can do to make autonomous driving better--I am at the mercy of the machine. I personally do not want that. In fact, I find all the supposed driver's aids are just distractions. Putting my life in the hands of supposed "self-driving" cars would be very stressful.

  4. Salamander

    Do these data include or distinguish the "accidents" in which Tesla drivers deliberately accelerate towards obstacles to show off how their car will "save" them, but it doesn't?

  5. rick_jones

    > The numbers are all but useless

    Then why repeat them? Seriously, why? If we are all supposed to be making rational, informed decisions based on sound data, why contribute to what is tantamount to gossip?

    I could kindasorta understand if you were still writing for Mother Jones, which probably needs the eyeballs and clicks, but this site isn't advertising or viewer supported.

    1. arghasnarg

      ... To point out that the NHTSA is either useless (generating non-actionable information) or worse than useless (running cover for car makers)?

      > but this site isn't advertising or viewer supported.

      Let's see, you were motivated enough to voice your confusion, but not motivated enough to reflect on the fact that you don't understand what you're commenting on. I think you're looking for Twitter, it is ---> thataway.

  6. Austin

    Why isn’t this data presented in 2 pie charts, instead of bar charts? The percentages are supposed to be “percent of the whole” but that’s very unclear given how it’s displayed.

    1. Austin

      When should I use a pie chart?

      Pie charts are probably better than any other visual for expressing a part-to-whole relationship. When you hear “percent of…” or “part of…” that’s one indication a pie chart could meet your needs.

      There are two primary use cases for a pie chart:

      - If you want your audience to have a general sense of the part-to-whole relationship in your data and comparing the precise sizes of the slices is less important.

      - To convey that one segment of the total is relatively small or large.

      https://www.storytellingwithdata.com/blog/2020/5/14/what-is-a-pie-chart
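
      As a sketch, here's what that would look like; the shares are my rough reading of Kevin's Level 2 chart, not official figures:

        import matplotlib.pyplot as plt

        # Approximate shares of reported Level 2 crashes by manufacturer,
        # eyeballed from the bar chart -- treat these as rough numbers.
        labels = ["Tesla", "Honda", "Other"]
        shares = [70, 23, 7]

        plt.pie(shares, labels=labels, autopct="%1.0f%%", startangle=90)
        plt.title("Reported Level 2 crashes by manufacturer (share of total)")
        plt.show()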

    2. 3j0hn

      I was going to ask if Kevin is making a VERY subtle jab at the unhelpfulness of the data from NHTSA by putting it into an unhelpful visualization?

    3. MindGame

      HA! Thanks for explaining this. I first was thinking that 70% of Teslas had crashed, which struck me as a "bit" high. LOL

  7. Doctor Jay

    I don't know what these numbers even represent. They are a percentage. Percentages are ratios. Presumably the numerator is the number of crashes, but what is the denominator? I have no idea, not even a hypothesis.

    1. jheartney

      They are each manufacturer's share of the reported crashes within each automation category. So 70% of the Level 2 crashes were Teslas, 23% were Hondas, and 7% were other. Same with the Level 3+ systems.
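
      Put another way, the denominator is just the total of reported crashes within the category. A sketch with invented counts that happen to reproduce those shares:

        # Invented counts chosen only to reproduce the published shares.
        level2_crashes = {"Tesla": 273, "Honda": 90, "Other": 27}

        total = sum(level2_crashes.values())
        for maker, n in level2_crashes.items():
            print(f"{maker}: {n / total:.0%} of reported Level 2 crashes")
        # -> Tesla: 70%, Honda: 23%, Other: 7%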

      Putting them in pie chart format would have made this clearer, but would also have highlighted how utterly useless the numbers are, as given. We don't know what percentage of each manufacturer's fleet this represents, or how many miles were driven, or what sort of driving domain (city or highway) the crashes happened in. There is also no way to know if the self-driving system was at fault in the crash.

      But it's even worse than that -- it's questionable if these are even comparable numbers. Teslas have real-time telematics reporting crashes immediately. The other manufacturer numbers apparently come from after-the-fact accident reports. We also have no idea if all the data used comparable criteria for saying what a crash was.

      I personally think Tesla's Full Self-Driving is a colossal scam which is unlikely to ever achieve actual autonomy, and the implementation is irresponsible and reckless. But we learn nothing at all germane about that from these numbers.

  8. Doctor Jay

    There is a very real fear of giving up control of the vehicle to a machine, and I get that.

    And, even right now, ADS vehicles have about 1/20th the number of accidents per mile that human-driven vehicles have. And maybe 1/100th the number that they have caused.

    Among that 1/100th are some very odd accidents - there are probably a few that would be extremely unlikely given a human driver.

    And yet it is those stories, and those fears, that drive our human narrative, not the statistics.

    This is a rare moment where we as liberals, who usually decry this kind of thinking when we see it in conservatives (where it is common, for sure), are more subject to it.

    We will get over this. We will come to accept AVs as part of life, or maybe just a younger generation will, and we oldsters never will. There will be a pathway that allows people to dip their toes in the water and feel better about the whole thing, having seen it for themselves. I don't know what that will be, though.

    1. KenSchulz

      It isn’t the stories for me; I know about the availability heuristic. It’s knowing that the people developing self-driving software can overlook the obvious, just like the software developers I used to work with.
      The Uber car that killed the pedestrian in Arizona had multiple stupid oversights: 1) the object-recognition software wouldn’t classify something as ‘pedestrian’ unless it was in or near a crosswalk; 2) it didn’t swerve to avoid a collision because it didn’t monitor adjacent lanes for nearby vehicles*. Not realizing that people jaywalk is stupid enough, but the latter omission is IMHO inexcusable. We humans are limited by where evolution put our eyes, but a machine can look in every direction at once. Why wasn’t that implemented?
      *This has always been a part of my defensive-driving repertoire. It saved me from collisions several times when I had a weekly commute along most of the Garden State Parkway.

      1. Doctor Jay

        Yeah, that was a bad one. I wouldn't trust Uber as such, either. There are lots of people doing this, and presumably, there are people learning how to test AVs as well - what to look for, etc.

        The question that I would ask is, though, how many times a year does a human driver do something dumb like this?

        Also of note is that here you are with an anecdote, as opposed to statistics. This is how human beings work.

        1. KenSchulz

          Not an anecdote, any more than is an NTSB report of an air crash investigation. Statistics are irrelevant here; if there is a systemic issue, it has to be fixed, or it will recur. Just because air travel has been safer than other modes for some time now, we haven’t stopped trying to make it safer.
          The point of the Uber episode is that these are systems designed by humans, and humans can overlook important factors.
          (I could have included, why was the braking response conditioned on identification of an obstacle? ‘There’s some medium-sized object in my lane, but I don’t know what it is, so I’ll just drive into it …’)
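
          To make that last point concrete, here is a schematic sketch of the two logics -- invented names and thresholds, not anyone's actual code:

            # Schematic only: function names, object classes, and the 30 m
            # threshold are all invented for illustration.
            def should_brake_gated(obj_class: str, distance_m: float) -> bool:
                # Braking gated on classification: an unidentified object
                # in the lane never triggers a response.
                return obj_class == "pedestrian" and distance_m < 30.0

            def should_brake_defensive(obj_class: str, distance_m: float) -> bool:
                # Any detected object in the travel path triggers braking,
                # whether or not the classifier knows what it is.
                return obj_class != "none" and distance_m < 30.0

            print(should_brake_gated("unknown", 12.0))      # False: drives into it
            print(should_brake_defensive("unknown", 12.0))  # True: brakes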

  9. Bluto_Blutarski

    I can't help wondering whether this is less about the cars themselves and more about the kind of people who buy them. Which is to say, people who want to make the Incel King even richer than he already is are the kind of people who drive without regard for anyone else, have short attention spans, and are generally just not smart enough to be in charge of heavy machinery.

    1. Doctor Jay

      Well, I know a few people who own Teslas, and your comment is pretty much a slander of them; it's so far from true.

      1. HokieAnnie

        I know folks who bought before it became apparent how much of a douche Musk was. Some have told me their next car won't be a Tesla.

  10. Solar

    "But in neither case do we know how many miles these cars have driven."

    While the graphs presented by the NHTSA are indeed kind of useless, the report does include spreadsheets with all the data, including each car's VIN and mileage.

    So, for example, for Level 2 ADS: Tesla had a total of 181 cars involved in crashes, and those 181 cars accounted for 281 separate crashes, with all but one car having multiple crashes.

    The average reported mileage of the cars at the time of a crash was 36,451 miles, with the earliest crash at 48 miles and the latest at 163,101 miles, but here things can also be misleading unless you keep track of each car.

    What seems concerning is that, regardless of the mileage when the first crash occurred, in nearly all cars successive crashes occur at a much faster frequency after that first crash (typically within a few hundred to a few thousand miles of the prior crash).

    For instance, one particular car was at 48,624 miles when the first crash occurred, but then after that, the same car crashed again at 50,762, 53,109, 53,744, 81,875 and 83,984 miles.

    For another car, the crashes occurred at 50,762 miles, then 51,340, then 51,915, then 52,537, then 55,046, and then 55,672.

    Similar story for the rest, so once the ADS system fails, it seems that it is done for good and should no longer be used.
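
    For anyone who wants to reproduce this from the spreadsheet, a sketch (the filename and column names are my guesses -- check them against the actual file):

      import pandas as pd

      # Hypothetical filename and column names -- adjust to the NHTSA file.
      df = pd.read_csv("nhtsa_level2_crashes.csv")

      # Order each car's crashes by odometer reading, then compute the
      # mileage gap between successive crashes of the same VIN.
      df = df.sort_values(["VIN", "Mileage"])
      df["miles_since_prior_crash"] = df.groupby("VIN")["Mileage"].diff()

      summary = df.groupby("VIN").agg(
          crashes=("Mileage", "size"),
          first_crash_mileage=("Mileage", "min"),
          median_gap=("miles_since_prior_crash", "median"),
      )
      print(summary.sort_values("crashes", ascending=False).head())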

    1. jheartney

      Do we have any idea what constituted a "crash"? Unless these were extremely minor fender-benders, you'd think they wouldn't happen more than a few times before the car was taken out of service. Were these cars being driven as test beds in challenging environments? Were they driven by customers or by test personnel? Was the autonomy system at fault?

      As you noted, it's useless to talk about these numbers without an idea of how many miles the autonomy system was engaged, and there's no way to know that based on the bare mileage numbers for each car.

    2. KenSchulz

      I don’t know what you can conclude from the crash-interval data; it could reflect a change in the testing protocol.

  11. cephalopod

    As another driver on the road, it is impossible for me to know if the poor driving is due to user behavior or software. So we just call them all "Drunk Teslas."

  12. DFPaul

    I think we're learning that a side-effect of the decline of the journalism business is that much of journalism has devolved to: "Big Numbers! Scary!" Look at inflation for instance. It's a sad thing to watch.

  13. MontyTheClipArtMongoose

    If we get to 2025 & fully driverless vehicular operation is not readily available, I think Drum should be forced to tell why he really left Mother Jones.

    It's prolly WOKENESS, but...

  14. Yikes

    Musk's nonsense doesn't help, and Tesla has absolutely the worst press coverage of any company, by miles.

    As an owner, and an owner with the Self Driving Beta, I have looked into all these stats.

    First off, Tesla only has 70% of the crashes because they have something like 99% of the cars with the system. And since the system actually works, the miles driven by Teslas are also off the charts. I mean, come on.

    Second, there is another statistic, which someone alluded to: what is the standard for crashes per mile or deaths per mile, take your pick?

    Teslas are one of the safest, if not the safest, cars you can buy. Sports cars like, say, the Corvette are by far the worst. Yes, that is because older, wealthier drivers drive more conservatively than younger, riskier drivers.

    Don't expect that stat to ever come up in a discussion like this.

    Finally, Teslas have what I am not sure any other car has: an almost airplane-level black box system which allows NHTSA, or any agency, to actually find out what happened in a given crash. There have been countless times where the press reported that self-driving was some sort of cause and it turned out not to be true.

    Musk has almost single-handedly botched the rollout of one of the most amazing technologies I have ever seen, including computers and cell phones. It's just astonishing.

    In a few years, though not at the moment, I will have one of the first cars that will physically avoid almost all accidents. That will happen, developmentally, before it can really "drive itself."

    My wife backed into a car the other day, somehow ignoring the beeps. It may be the last dent we ever have to fix.
