In 2009, Toyota accelerator pedals began mysteriously getting stuck, at times trapping panicked drivers in out-of-control cars that tragically crashed.
But was the problem really the car?
It's the subject of famed writer Malcolm Gladwell's most recent "Revisionist History" podcast, a series that explores the overlooked angles from some of history's biggest stories. Gladwell's conclusion provides a poignant lesson for today's automakers and drivers, but unfortunately it's one that often isn't heard.
Here's the quick backstory on the "sticky pedal" crisis: In 2009, a man phoned 911 to report that his accelerator pedal was stuck and he couldn't get the car to stop. He said his brakes weren't working. Ultimately, his car crashed into another and then plunged into a ravine. Everyone inside the vehicle was killed.
The 911 call went viral, and the scandal broke wide open. Over the next five years, an estimated 90 people died in Toyotas that mysteriously accelerated. Toyota recalled millions of vehicles but was accused of concealing information about the flawed pedals. In 2014, the company paid $1.2 billion to avoid prosecution for covering up information about problems with "unintended acceleration" that the FBI said Toyota "knew was deadly."
At the time, two theories emerged to explain why these pedals suddenly had minds of their own. One involved software malfunctions, while the other blamed floor mats that slid around and pinned the pedals down.
But according to Gladwell, the software explanation doesn't hold up: multiple tests have shown that even when a driver is pushing the throttle to the floor, hitting the brakes will stop the car.
And an investigation by the Department of Transportation in 2011 found that floor mats accounted for only a small fraction of the accidents.
The real culprit? Human error. More often than not, drivers who reported that their accelerators were stuck were inadvertently flooring it while thinking they were pressing the brakes. Data from many of the "black boxes" in cars involved in incidents of unintended acceleration showed that in most cases, the brakes were never even touched.
The drivers were often in vehicles that were new or unfamiliar to them, or for whatever reason, they just got confused.
One of the more frustrating aspects of this whole fiasco was the media's response. Instead of alerting drivers to the potential danger of confusing the accelerator with the brake, which could happen to any of us, the focus was on Toyota's cover-up, the scary and unpredictable software in cars, and of course, the floor mats.
This distinction matters a great deal as we head into an era of self-driving vehicles.
Just over a month ago, a man behind the wheel of a "self-driving" Tesla was killed when his car crashed into a tractor trailer. For many, the focus was on understanding why the technology "failed," and there's speculation that the sun's glare might have thrown it off.
But it's critical to note how human error played a major role. According to Tesla, the car is designed for the driver to keep their hands on the wheel at all times, even though it's equipped with systems to help the driver if something goes wrong. In this case, that didn't happen.
In a way, it might seem like a situation with unavoidable consequences: automakers can beg drivers to pay attention in these "self-driving" cars as much as they want, but at the same time, these autopilot features will probably lull most drivers into a dangerous state of complacency.
It's an arrangement that feels destined for at least a few disasters. (Especially considering how the same scenario has played out in aviation, as in the case of Flight 447, which tragically crashed into the Atlantic Ocean en route from Rio de Janeiro to Paris.)
This is why an open and honest conversation about the role of human error is so vital. Too often, drivers expect car manufacturers and vehicles to make it easy. But as cars and automakers get smarter, drivers have to wise up as well.
Last month, another driver using Teslaβs autopilot feature crashed.
According to a report of the incident, the car alerted the driver to take the wheel due to uncertain road conditions, but he didn't; luckily, no one was injured. A trooper on the scene decided not to cite the driver, who blamed the accident on the car. And now the feds are launching an investigation into the limits of the technology.
Hopefully they'll be factoring in the biggest limitation: humans.
Until drivers understand the new technology in their cars and stop taking it for granted, they can't be expected to handle it safely. And that means an honest discussion about why we make errors and, for whatever reason, don't put our hands on the wheel when we should.