
Four Seconds to Respond? Faulty Assumptions Led to 737 Disasters

The story of missteps in the plane’s design emerged from hundreds of pages of dry, technical language in the Indonesian report.

Boeing Co. 737 Max airplanes are seen at the company’s manufacturing facility in Renton. (Photographer: David Ryder/Bloomberg)

(Bloomberg) -- The flight-control feature implicated in two fatal crashes on Boeing Co.’s 737 Max was built on a foundation of false assumptions.

Boeing underestimated its risks, didn’t consider how changes to the system would heighten the danger, and kept some of the government regulators overseeing the plane’s design in the dark, a report by Indonesian investigators concluded.


The most comprehensive review to date of the Maneuvering Characteristics Augmentation System also found that the U.S. Federal Aviation Administration delegated too much authority to Boeing for its approval.

The designers and the regulators also used unrealistic assumptions about how pilots would behave, the report concluded. For example, crews were expected to diagnose and react within just four seconds when the unfamiliar system fired. Moreover, the investigators said, current regulations don’t require the potential for human failure to be considered as manufacturers calculate the probability of an aircraft system failure.

“The aircraft design should not have allowed this situation,” Indonesia’s National Transportation Safety Committee said in a sweeping 322-page report released Friday.

The NTSC issued its conclusions and 25 recommendations to local regulators, the airline, Boeing and the FAA almost one year after a Lion Air 737 Max dove at high speed into the Java Sea minutes after takeoff from Jakarta on Oct. 29, 2018. The crash was prompted by a malfunction with MCAS that repeatedly commanded the plane to dive.


Separately, Ethiopian authorities are preparing a report on a March 10 crash near Addis Ababa of a 737 Max. The two accidents killed 346 people and led to the worldwide grounding of the plane. Boeing is still finalizing fixes to the system, which it’s hoping to complete by the end of the year.

“We are addressing the KNKT’s safety recommendations, and taking actions to enhance the safety of the 737 MAX to prevent the flight control conditions that occurred in this accident from ever happening again,” said Boeing Chief Executive Officer Dennis Muilenburg, using the Indonesian initials for the investigative agency. “Safety is an enduring value for everyone at Boeing and the safety of the flying public, our customers, and the crews aboard our airplanes is always our top priority.”

The FAA said in a statement that it welcomed the Indonesian recommendations and will consider them as it assesses changes to the 737 Max. “The FAA is committed to ensuring that the lessons learned from the losses of Lion Air Flight 610 and Ethiopian Airlines Flight 302 will result in an even greater level of safety globally,” the agency said in the statement.

The story of missteps in the plane’s design emerged from hundreds of pages of dry, technical language in the Indonesian report, some of which was prepared with the help of the U.S. National Transportation Safety Board. While at times harsh in its assessment, it contained no hint as to why the U.S. Justice Department is conducting a criminal probe of the plane’s approval.

At the center of the inquiry is MCAS, an automated system that pilots were never told about. Early in 2012, five years before the plane was certified by the FAA, engineers determined that the redesigned version of the 737 was prone to nosing up in certain conditions. That could lead to a dangerous aerodynamic stall and loss of control.


So the Chicago-based plane-maker’s engineers developed MCAS. It functioned by moving a wing-like device at the tail known as a horizontal stabilizer to ensure the nose didn’t get too high. The stabilizer can swivel in flight and is routinely adjusted to raise and lower the nose.

Initially, MCAS was designed to operate only at high speeds and was limited to moving the stabilizer 0.6 degrees, a small fraction of its full range of travel. The FAA office that oversees Boeing designs signed off on it.

Boeing classified the safety risk posed by the system as “major,” two steps below the most severe ranking. That meant the company didn’t have to conduct an extensive analysis examining the complex details of how a failure would occur.

Pilots tested it in a flight simulator, but without the real-life scenarios that occurred in each accident.

Recognizable Failure

Boeing and the FAA assumed that a failure would be “readily recognizable” by pilots and easily counteracted -- a deadly miscalculation that proved untrue in both accidents. On an earlier Lion Air flight with the same malfunction, it took several minutes and the help of a third pilot in the cockpit before the crew responded properly, the report said.

The risks created by those assumptions were compounded by the next step in the design.

Boeing determined that there was also a need for the system at slower speeds, and expanded MCAS’s functionality. In the new design, it could move the stabilizer as much as 2.5 degrees. It was at this point that several critical stumbles occurred, the NTSC report concluded.

Boeing categorized the risk of a malfunction at slow speeds as “minor.” That meant no new safety analyses or flight-simulator tests were required, and the FAA allowed the company to use its own engineers to sign off on the changes. The FAA routinely deputizes employees of Boeing and other manufacturers to sign off on designs, but it’s supposed to retain more direct control over safety-critical decisions.

Oversee Boeing

While the company explained the changes to some FAA officials, it didn’t tell those directly overseeing flight-control systems, the report said.

“Boeing did not submit the required documentation and the FAA did not sufficiently oversee Boeing,” the report said.

The increased movement commanded by MCAS proved a fierce and confounding problem during the accidents, especially as it activated repeatedly. Even though flight-control systems are supposed to be easily counteracted in a failure, the new, expanded MCAS was more difficult to control during a malfunction.

And the confusing and loud cockpit environment made it difficult for pilots to diagnose, exacerbating the problem, investigators found.

Boeing engineers actually considered whether these multiple MCAS activations required them to redesign the system, according to a submission to the Indonesian team by the NTSB. Again, the company dismissed the need, assuming pilots would be able to respond.

Pilot Response

One of the underlying assumptions was that pilots would recognize a failure within one second and react within three more seconds. That proved wildly optimistic.

In a Lion Air flight one day before the crash, pilots took three minutes and 40 seconds to recognize what was happening and to halt MCAS. The crew that crashed never figured it out.

Even the interim step that the Lion Air crew in the accident took to partially mitigate MCAS -- adjusting the horizontal stabilizer manually using a switch on the control column -- took as long as eight seconds to perform, the report said. And the crew involved in the accident seemed unable to do it properly, it said.

Part of the reason was that Boeing, assuming the risks of an MCAS failure had been adequately addressed, didn’t include any mention of the system in flight manuals. Doing so might have required costly additional pilot training. Recently released text messages sent by a Boeing pilot indicated the company was under intense pressure not to add new training.

Discussions with FAA officials about whether to include MCAS in the manuals didn’t consider the scenarios the crews encountered in the accidents, the report said.

One of the most sweeping findings could rock the entire aviation manufacturing sector.

Current regulations, the report said, don’t require the potential for human failure to be considered as manufacturers calculate the probability of an aircraft system failure -- even if the humans at the controls are supposed to rectify the problem.

The NTSC recommended that the FAA work with other regulators around the world to review its assumptions on pilot behavior, following a similar set of recommendations issued by the U.S. NTSB.

To contact the reporter on this story: Alan Levin in Washington at alevin24@bloomberg.net

To contact the editors responsible for this story: Jon Morgan at jmorgan97@bloomberg.net, Ros Krasny

©2019 Bloomberg L.P.