The 2009 Toyota Accelerator Scandal That Wasn’t What It Seemed

And why it matters for understanding our rocky relationship with today’s autonomous vehicles.

(AP Photo)

In 2009, Toyota accelerator pedals began mysteriously getting stuck, at times trapping panicked drivers in out-of-control cars that tragically crashed.

But was the problem really the car?

It’s the subject of famed writer Malcolm Gladwell’s most recent “Revisionist History” podcast, a series that explores the overlooked angles from some of history’s biggest stories. Gladwell’s conclusion provides a poignant lesson for today’s automakers and drivers — but unfortunately it’s one that often isn’t heard.

Here’s the quick backstory on the “sticky pedal” crisis: In 2009, a man phoned 911 to report that his accelerator pedal was stuck and he couldn’t get the car to stop. He said his brakes weren’t working. Ultimately, his car crashed into another and then plunged into a ravine. Everyone inside the vehicle was killed.

The 911 call went viral, and the scandal broke wide open. Over the next five years, an estimated 90 people died in Toyotas that mysteriously accelerated. Toyota recalled millions of vehicles but was accused of concealing information about the flawed pedals. In 2014, the company paid $1.2 billion to avoid prosecution for covering up information about problems with “unintended acceleration” that the FBI said Toyota “knew was deadly.” 

At the time, two theories emerged to explain why these pedals suddenly had minds of their own. One involved software malfunctions, while the other blamed floor mats that slid around and pinned the pedals down.

But according to Gladwell, the software explanation doesn’t hold up: multiple tests have shown that even when a driver is pushing the throttle to the floor, hitting the brakes will stop the car.

And an investigation by the Department of Transportation in 2011 found that floor mats accounted for only a small fraction of the accidents.

The real culprit? Human error. More often than not, drivers who reported that their accelerators were stuck were inadvertently flooring it and thinking they were pressing the brakes. Data from many of the “black boxes” from cars involved in incidents of unintended acceleration showed that in most cases, the brakes were never even touched. 

The drivers were often in vehicles that were new or unfamiliar to them, or for whatever reason, they just got confused.

(AP Photo)

One of the more frustrating aspects of this whole fiasco was the media’s response. Instead of alerting drivers to the potential dangers of confusing the accelerator with the brake — which could happen to any of us — the focus was on Toyota’s cover-up, the scary and unpredictable software in cars, and of course, the floor mats.

This distinction matters a great deal as we head into an era of self-driving vehicles.

Just over a month ago, a man behind the wheel of a “self-driving” Tesla was killed when his car crashed into a tractor trailer. For many, the focus was on understanding why the technology “failed,” and there’s speculation that the sun’s glare might have thrown it off.

But it’s critical to note how human error played a major role. According to Tesla, the car is designed for the driver to keep their hands on the wheel at all times, even though it’s equipped with systems to help the driver if something goes wrong. In this case, that didn’t happen.

In a way, it might seem like a situation with unavoidable consequences: automakers can beg drivers to pay attention in these “self-driving” cars as much as they want, but at the same time, these autopilot features will probably lull most drivers into a dangerous state of complacency.

It’s an arrangement that feels destined for at least a few disasters. (Especially considering how the same scenario has played out in aviation — as in the case of Flight 447, which tragically crashed into the Atlantic Ocean en route from Rio de Janeiro to Paris.)

This is why an open and honest conversation about the role of human error is so vital. Too often, drivers expect car manufacturers and vehicles to make it easy. But as cars and automakers get smarter, drivers have to wise up as well.

(Image credit: Tesla Motors)

Last month, another driver using Tesla’s autopilot feature crashed.

According to a report of the incident, the car alerted the driver to take the wheel due to uncertain road conditions, but he didn’t — luckily no one was injured. A trooper on the scene decided not to cite the driver, who blamed the accident on the car. And now the feds are launching an investigation into the limits of the technology.

Hopefully they’ll be factoring in the biggest limitation: humans.

Until we understand the new technology in cars and stop taking it for granted, humans can’t be expected to handle it safely. And that means an honest discussion about why we make errors and why, for whatever reason, we don’t put our hands on the wheel when we should.
