I was struck by an extraordinary above-the-fold article in the Sunday New York Times (NYT), entitled The Late Change, and Fatal Flaws in Boeing’s Plane, by a plethora of reporters including Jack Nicas, Natalie Kitroeff, David Gelles and James Glanz. (The physical location of the article in the print edition was also significant, as it reminded me of when the NYT broke the story of Wal-Mart’s corruption allegations in Mexico in the same place, the right-hand column above the fold of an edition of the Sunday Times, back in 2012.) Matt Kelly wrote a great blog post on the article and his interpretation of it in Radical Compliance, entitled Another Lesson from Boeing: Silos. Kelly was spot on regarding his analysis of the siloed nature of Boeing’s design and construction process that caused or contributed to the catastrophic failure of the 737 Max due to the failure of the Maneuvering Characteristics Augmentation System (MCAS).

Kelly’s excellent piece frees me up to explore some of the other contributing factors in the failure of the MCAS. The overriding theme of the article, which struck me so resonantly, was the number of compliance missteps that led to the disaster. While the siloed nature of Boeing’s process led to a literal number of very small steps which contributed to the final disaster, it demonstrated to me even more clearly why compliance must not only have a seat at the table but also be embedded throughout your organization. Because of the silos, the pilots did not know what the engineers were designing, the software developers did not appreciate how their changes were communicated to the employees responsible for assessing their safety impact, and those employees were disconnected from the people tasked with obtaining regulatory approvals. The clear lesson for a Chief Compliance Officer (CCO) is that you must have visibility over your corporate operations to see how they all fit together.

Yet several other key factors came into play. The first was that the plane had to handle smoothly; the test pilots had criticized the 737 Max on this point. To improve the plane’s handling, engineers changed the MCAS, moving from two sensors to only one before the computer kicked in and took over the plane in certain circumstances. The two-sensor array for the MCAS had been approved by the Federal Aviation Administration (FAA), which apparently was not informed of the change down to one sensor, as Boeing engineers assigned the change the risk category of hazardous rather than catastrophic in the event of some type of failure of the sensor or plane. Yet moving to a single external sensor created a major deficiency: the risk that a sensor failure would lead to a safety failure increased, and the risk that an external event, such as weather or a bird strike, would damage or destroy the sensor became much greater. What started out as an ease-of-use issue for pilots morphed into a reduction in the safe operation of the aircraft.

The next decision was one around cost cutting, and it now appears to have been critical. Boeing did not want to “spend millions of dollars on additional training,” which it would have been required to do if the design changed significantly from the prior 737. So this change was not communicated to the FAA. Even worse for the pilots flying the plane, Boeing asked for and received approval from the FAA to remove information about the MCAS from the Pilot’s Manual. As the NYT piece stated, “Under the impression that the system was relatively benign and rarely used, the FAA approved” Boeing’s request. Now even pilots who had the time to consult the Pilot’s Manual could not get any answers. This move to cut potential costs, by (1) removing the MCAS from the Pilot’s Manual and (2) not training pilots on the revised MCAS, created a situation in which pilots who were given incorrect information from the MCAS sensors had not received training on how to correct the situation and could not look it up even if they had the time to do so.

Boeing compounded all of these mistakes after the first of the two crashes, as the company still defended the single-sensor array for the MCAS. As reported, “At a tense meeting with the pilots’ union at American Airlines in November, Boeing executives dismissed concerns. ‘It’s been reported that it’s a single point failure, but it is not considered by design or certification a single point,’ said Mike Sinnett, a Boeing vice president, according to a recording of the meeting. His reasoning? The pilots were the backup. ‘Because the function and the trained pilot work side by side and are part of the system,’ he said.” In the face of a demonstrated safety problem, Boeing refused to acknowledge its error.

In his blog post, Kelly focused on creating visibility across silos and fuller communications across corporate disciplines. These are both excellent points, but I think other compliance lessons should also be considered. Start with risk: not simply the risk of a catastrophic safety failure, but the risk created when people are pushed too fast and too hard to meet deadlines and cut costs. That is clearly a risk which came to fruition here. If your organization is under high pressure to meet a project goal or rapidly increase sales, does the risk of a compliance failure leading to a legal violation increase? Have you trained your personnel on what to do when such risks arise? Do you have a speak-up culture which encourages employees to raise these risks for review and oversight?

There remains much more to learn from the Boeing 737 Max imbroglio. Every compliance professional needs to follow this story to see what lessons you can not only learn but impart to your organization.

This publication contains general information only and is based on the experiences and research of the author. The author is not, by means of this publication, rendering business, legal advice, or other professional advice or services. This publication is not a substitute for such legal advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified legal advisor. The author, his affiliates, and related entities shall not be responsible for any loss sustained by any person or entity that relies on this publication. The Author gives his permission to link, post, distribute, or reference this article for any lawful purpose, provided attribution is made to the author. The author can be reached at tfox@tfoxlaw.com.

© Thomas R. Fox, 2019
