
Tesla Autopilot Said by NTSB Staff to Share Blame for 2016 Crash

In 2016, a Tesla Model S operating on Autopilot drove itself into the side of a truck. Federal investigators say the system shares blame for the fatal crash.

Customers look at a Tesla Inc. Model S 90D electric vehicle parked at the company’s charging station in Hanam, Gyeonggi Province, South Korea. (Photographer: SeongJoon Cho/Bloomberg)

(Bloomberg) -- Federal accident investigators are poised to find that Tesla Inc.’s auto-driving system should share blame for a fatal 2016 crash in which a Model S sedan drove itself into the side of a truck.

The investigative staff of the U.S. National Transportation Safety Board, in its first probe of the wave of autonomous driving systems being introduced by carmakers, has recommended that Tesla’s Autopilot system be declared a contributing factor in the crash because it allowed the driver to go for long periods without steering or, apparently, even looking at the road, according to a person briefed on the findings.

The safety board’s findings and recommendations could have broad implications for how self-driving technology is phased in on cars and trucks, and they come as Congress debates legislation to spur autonomous vehicle systems. Tech and auto companies are pouring billions of dollars into the race to develop self-driving vehicles, which carmakers from Tesla to Volvo Cars say could be deployed in less than 10 years.

The NTSB is meeting Tuesday to issue its final findings on the crash, and the conclusions are subject to revision by its board members. The staff has recommended a finding that Tesla’s automation allowed Joshua Brown to effectively let the car drive itself even though the manufacturer had warned customers not to do so, the person said.

Brown, a former Navy SEAL, died when his Model S struck the truck at an intersection. The truck driver’s failure to yield and Brown’s inattention to the highway were the primary reasons for the collision, the NTSB staff concluded in its draft report, according to the person briefed on it.

Such automation has on the whole reduced accidents, but overly complex systems or poor understanding of how they work can lead to problems. In multiple aviation accidents, such as the 2013 crash of an Asiana Airlines Inc. jetliner in San Francisco, the safety board has found automation systems were at least partly to blame.

Brown, who “loved technology,” believed the Tesla automation had saved lives, according to a statement released by his family on Monday through their attorneys. “We heard numerous times that the car killed our son,” said the statement issued by the law firm Landskroner Grieco Merriman LLC. “That is simply not the case.”

The statement also praised Tesla for improving its Autopilot software after the accident, changes it said were a direct result of the crash.

Tesla didn’t provide an immediate comment on the draft conclusions. The company said in a statement last year that customers had to acknowledge Autopilot’s limitations before it would allow the systems to operate. Every time the system is engaged, it reminds drivers: “Always keep your hands on the wheel. Be prepared to take over at any time.”

Brown, 40, who lived in Ohio, was driving near Williston, Florida, on May 7, 2016, when his car struck the side of a truck trailer that was making a left turn in front of him. There was no evidence that his car attempted to slow down or that he made any evasive maneuvers, according to previously released NTSB data.

The car’s cruise control and auto-steering systems were engaged for more than 37 of the last 41 minutes of his trip, according to previously released data from the NTSB. Brown’s hands were detected on the steering wheel just twice during that time, for brief periods of 20 to 30 seconds, according to the agency.

While the car gave Brown a visual warning to “Hold Steering Wheel” on seven occasions, six of which were also followed by an audible warning, it permitted him to keep driving, according to NTSB records.

Four months after the accident, Tesla released a new version of Autopilot that makes it harder for drivers to ignore warnings to put their hands on the steering wheel. The car will automatically stop after a driver ignores the warnings, and the auto-steering function can’t be reengaged until the car has been parked.

The company also altered how its system detects potential obstructions ahead, such as the white side of the tractor trailer that it couldn’t distinguish against a bright sky. The newer version emphasizes radar rather than camera sensors for scanning the road ahead, according to a company statement.

The National Highway Traffic Safety Administration conducted a separate investigation and concluded in January that it didn’t find a defect in the Tesla design that warranted a recall.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," the highway safety agency said in a release. 

A class-action suit was filed against Tesla in April alleging that the Autopilot software is “dangerously defective” when engaged. The cars sometimes veer out of lanes, jam on the brakes for no reason or fail to stop when approaching other vehicles, the suit filed in California claimed.

Tesla responded that it never claimed its vehicles are equipped with “full self-driving capability,” said the suit misrepresented facts, and called the action “a disingenuous attempt to secure attorney’s fees.”

--With assistance from Dana Hull

To contact the reporter on this story: Alan Levin in Washington at alevin24@bloomberg.net.

To contact the editors responsible for this story: Jon Morgan at jmorgan97@bloomberg.net, Elizabeth Wasserman