
Tesla Model X in California Crash Sped Up Prior to Impact

Tesla Model X crash: car clocked 71 miles an hour before collision

A driver prepares to connect a plug to a Tesla Inc. Model X electric automobile at a charging point in a parking lot in Frankfurt, Germany. (Photographer: Alex Kraus/Bloomberg)

(Bloomberg) -- The Tesla Inc. Model X that crashed in California earlier this year while being guided by its semi-autonomous driving system sped up to 71 miles an hour in the seconds before the vehicle slammed into a highway barrier, investigators said Thursday.

A U.S. National Transportation Safety Board preliminary report on the March 23 accident in Mountain View raises new questions about the capabilities of Tesla’s semi-autonomous driving system and the actions of the driver. His hands were detected on the steering wheel for only 34 seconds during the last minute before impact, and he had set the car’s cruise control at 75 mph, the report said.


The investigation is the latest to shine a spotlight on potential flaws in emerging autonomous driving technology. Tesla has touted its system as having self-driving capabilities, even though its vehicles have failed to stop for stationary objects in the road in several cases and owners are warned to remain attentive.

Another NTSB probe of a self-driving Uber Technologies Inc. car that killed a pedestrian March 18 in Arizona found that the car’s sensors picked up the victim, but the vehicle wasn’t programmed to brake for obstructions.

Walter Huang, a 38-year-old engineer who worked at Apple Inc., died in the March 23 crash when his Model X struck the barrier as he was using the driver-assistance system known as Autopilot. The car’s computer didn’t sense his hands on the steering wheel for six seconds before the collision, according to NTSB.

Tesla shares, which had been up 3.3 percent, fell during the day and closed down 1.09 percent at $316.02 in New York trading.

The preliminary report didn’t include conclusions about what caused the crash. “All aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes,” the report said.

A Tesla spokeswoman declined to comment on the NTSB’s report and pointed to a March 30 company blog post. In that post, the company said, citing vehicle logs, that the driver had about five seconds and 150 meters of unobstructed view of the highway barrier but took no action to avoid the collision.

“Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur,” the company wrote in the post. “It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

An attorney hired by Huang’s family, Mark Fong, said the NTSB report appears to contradict Tesla’s earlier characterization of the accident and bolsters the family’s view that the car’s systems failed. “The Autopilot system should never have caused this to happen,” said Fong of Minami Tamaki LLP.

Huang had been using Tesla’s Autopilot system continuously for nearly 19 minutes before the accident. The system issued two visual alerts and one auditory alert for the driver to place his hands on the steering wheel, but those occurred more than 15 minutes before the crash, according to the report.

The NTSB didn’t report any alerts in the moments leading up to the crash.

Eight seconds before the crash, the Tesla was following a lead vehicle at about 65 miles per hour. A second later, the car began to steer left while still following the lead vehicle. Four seconds before the crash, it was no longer following the lead vehicle, the NTSB said.

The Model X then accelerated from 62 mph (99 kilometers per hour) to 70.8 mph in the final three seconds before impact. The Autopilot’s cruise control system, which is designed to match the speed of a slower vehicle ahead of it, was set at 75 mph.

The Tesla collided with a so-called crash attenuator, a device covering the concrete barrier that’s designed to absorb a vehicle impact and lower the risk of damage and injuries. The attenuator had been damaged 11 days earlier in a previous accident and hadn’t been repaired, according to the NTSB. The barrier sits in the median at the point where the highway splits into two directions.

No pre-crash braking or evasive steering movement was detected, according to NTSB’s summary of performance data recorded by the car.

Huang, found belted in his seat after the crash, was removed by bystanders before being taken to a nearby hospital, where he died from his injuries. The impact was so violent it tore off the front section of the vehicle.

While Tesla tells drivers they must keep their hands on the steering wheel and monitor the semi-autonomous system, the car can follow traffic, steer and control speed in some situations.

The NTSB originally announced it was looking into a fire that erupted in the car’s battery, which was damaged in the impact. The agency is also investigating a fire in a fatal Tesla crash on May 8 in Fort Lauderdale, Florida.

The Model X’s battery pack was sheared open by the crash and erupted in flames, which firefighters extinguished with about 200 gallons of water and foam. The battery reignited on March 28 in an impound lot, NTSB said. The NTSB has examined the risks of battery fires for more than a decade.

Investigators expanded their probe in the Mountain View crash to include the vehicle’s automation after the company revealed the Autopilot system was switched on.

Two consumer advocacy groups charged May 23 that Tesla’s promotional material on Autopilot is deceptive. A Tesla website says its vehicles have “full self-driving hardware.” The site also contains a video of a car navigating streets without human input, with text saying “the car is driving itself.”

On Tesla’s May earnings call, Chief Executive Officer Elon Musk dismissed the notion that Autopilot users involved in accidents have the mistaken belief that the system is capable of fully autonomous driving. Driver complacency is more of a challenge, he said.

“When there is a serious accident, it is almost always -- in fact, maybe always -- the case that it is an experienced user and the issue is more one of complacency. Like, they get too used to it,” Musk said on a conference call with analysts. “That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about autopilot than they do.”

Tensions in the NTSB Mountain View inquiry boiled over on April 11 when Tesla released information about the accident without first clearing it with investigators, prompting the agency to take the unusual action of removing the car company from official participation.

Tesla had issued comments blaming the driver of the Tesla SUV. “While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest,” NTSB Chairman Robert Sumwalt said in a statement.

Musk hung up on Sumwalt as the chairman was explaining the removal, according to the NTSB chief.

While the action was a rebuke to the company and cut it out of the information loop on some matters, the NTSB retains legal authority to obtain information from Tesla engineers about how its car performed in the accident.

Issues that arose last year in a separate NTSB investigation of a Tesla accident are likely to be raised again in the latest case.

The NTSB found in September that Tesla’s design of its Autopilot contributed to a 2016 fatal crash in Florida. An Ohio man had driven for 37 minutes before the accident while only occasionally putting his hands on the steering wheel. Sensors on Tesla cars, which help them stay within highway lanes and brake when a vehicle stops ahead, weren’t designed to detect the large truck, the NTSB found.

The board recommended that regulators find better ways to measure driver attentiveness, such as using scanners that monitor where a person’s eyes are looking. While Tesla updated its vehicle’s software to make it more difficult to drive without hands on the wheel, it has rejected this scanning technology.

The safety board in the earlier accident said the drivers of both the truck and the Tesla were also partly responsible for the crash.

The NTSB is also investigating two other incidents involving Teslas: a January accident near Los Angeles in which a Model S on Autopilot struck a fire truck parked on a freeway, and an August crash in Lake Forest, California, in which a Model X hit a garage and its battery caught fire.

The National Highway Traffic Safety Administration is also investigating the case of a Model S driven by a 28-year-old woman that struck a stopped fire truck on a South Jordan, Utah, roadway on May 11. Autopilot was engaged and didn’t brake for the truck as the driver was looking at her phone, according to police.

--With assistance from Susan Decker and Dana Hull.

To contact the reporters on this story: Alan Levin in Washington at alevin24@bloomberg.net; Ryan Beene in Washington at rbeene@bloomberg.net

To contact the editors responsible for this story: Jon Morgan at jmorgan97@bloomberg.net, Elizabeth Wasserman, Justin Blum

©2018 Bloomberg L.P.