WASHINGTON (AP) — An inattentive driver using his Tesla Model S's semi-autonomous driving system and a truck driver who made a left-hand turn in front of the car are both to blame for a fatal crash last year, the National Transportation Safety Board said Tuesday.
The board also recommended automakers incorporate safeguards that limit the use of automated vehicle control systems to the conditions for which they were designed. Joshua Brown, 40, of Canton, Ohio, was traveling on a divided highway near Gainesville, Florida, using the Tesla's automated driving systems when he was killed.
Tesla had told Model S owners the automated systems should only be used on limited-access highways, which are primarily interstates. But it didn't incorporate protections against their use on other types of roads, the board found. Despite upgrades since the May 2016 crash, Tesla has still not incorporated such protections, NTSB chairman Robert Sumwalt said.
"In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks in a limited range of environments," Sumwalt said. "Tesla allowed the driver to use the system outside of the environment for which it was designed."
The result, he said, was a collision "that should never have happened."
In a statement, Tesla said "we appreciate the NTSB's analysis of last year's tragic accident and we will evaluate their recommendations as we continue to evolve our technology." The company added that overall its automated driving systems, called Autopilot, improve safety.
NTSB directed its recommendations to all automakers, rather than just Tesla, saying the oversight is an industrywide problem. Manufacturers should be able to use GPS mapping systems to create such safeguards, Sumwalt said.
Manufacturers should also develop ways of ensuring operators remain attentive to the vehicle's performance when using semi-autonomous driving systems, beyond detecting the pressure of hands on the steering wheel, the NTSB recommended. Brown had his hands on the sedan's steering wheel for only 25 seconds out of the 37.5 minutes the vehicle's cruise control and lane-keeping systems were in use prior to the crash, investigators found.
As a consequence, Brown's attention wandered and he didn't detect the semitrailer in his path, they said.
The collision is the first known fatal crash of a highway vehicle operating under automated control systems, according to the NTSB.
The Model S is rated level 2 on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways. Drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary.
Investigators also found that the sedan's cameras and radar weren't capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions. The board reissued previous recommendations that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicles' location, speed, heading and other information to other vehicles in order to prevent collisions.
Brown's family defended his actions and Tesla in a statement released Monday. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car," the statement said. "People die every day in car accidents. Many of those are caused by lack of attention or inability to see the danger."
Brown was a technology geek and enthusiastic fan of the Model S who posted videos about the car and spoke to gatherings at Tesla stores. "Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either," the statement said.
The National Highway Traffic Safety Administration, which regulates auto safety, declined this year to issue a recall or fine Tesla as a result of the crash, but it warned automakers not to treat semiautonomous cars as if they were fully self-driving.
Follow Joan Lowy on Twitter @AP_Joan_Lowy