What Insurers Can Learn From the Aftermath of the Boeing 737 Max Crashes

Back in the spring, I blogged about the issues with the Boeing 737 Max. If you remember, the 737 Max has an automated system called MCAS. The new plane was built with slightly larger, more fuel-efficient engines that had to be mounted a bit farther forward and higher on the wings. This changed how the plane handles, giving it a tendency to pitch nose up when flown manually. MCAS pushes the nose back down. MCAS depends on a single angle-of-attack sensor; if that sensor fails, MCAS malfunctions and pushes the nose down when it should not. This happened twice, both planes crashed, and everyone on board was lost.
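For readers who build software, the single-sensor dependency is the core failure. A minimal sketch of the alternative, cross-checking redundant sensors and handing control back to the human when they disagree, might look like the following (the function names, thresholds, and logic here are illustrative assumptions, not Boeing's actual design):

```python
# Hypothetical sketch: cross-check redundant sensors before an automated
# system acts, rather than trusting a single input. Names and thresholds
# are illustrative assumptions only.

DISAGREEMENT_LIMIT_DEG = 5.0  # assumed tolerance between the two readings


def angle_of_attack_command(sensor_a_deg: float, sensor_b_deg: float) -> str:
    """Decide whether an automated nose-down correction should engage."""
    if abs(sensor_a_deg - sensor_b_deg) > DISAGREEMENT_LIMIT_DEG:
        # Sensors disagree: fail safe, alert the crew, take no automatic action.
        return "disengage_and_alert_crew"
    average_aoa = (sensor_a_deg + sensor_b_deg) / 2
    if average_aoa > 14.0:  # assumed stall-risk threshold for illustration
        return "apply_nose_down_trim"
    return "no_action"


if __name__ == "__main__":
    print(angle_of_attack_command(16.0, 15.5))  # sensors agree, high AoA -> trim
    print(angle_of_attack_command(16.0, 4.0))   # sensors disagree -> hand back to crew
```

The point of the sketch is simply that a second reading and a fail-safe branch turn a single point of failure into a recoverable disagreement.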

In classic software engineering style that any insurer CIO would recognize, the program suffered from feature creep. New software was added to compensate for “minor” changes to the plane, and the rollout of those new features was clearly not managed correctly. Quality assurance around integration testing between the new features and the existing ones (measuring altitude, climb rate, airspeed, etc.) was never fully assessed or documented. To top it off, the users (in this case, the pilots) were initially not properly trained on the new features. Sounds like some core system projects I have heard of.

Well, the plane is still not flying, and it is not expected to return to service until sometime in 2020. Two senators are proposing legislation that would mandate the safety features Boeing ignored; those mandates would fall on the FAA, which clearly trusted Boeing too much during the certification process. The bill would also create an FAA Center of Excellence to study flight automation and human factors in commercial aircraft. Meanwhile, the Boeing CEO was due to testify before a congressional committee this week; he has already been stripped of his chairman title, and the executive who ran the Commercial Airplanes division for him has been fired.

Two other lessons come to mind that are applicable to insurance. First, as we create more highly automated processes around underwriting and claims, the human factors need to be understood and incorporated into those processes and the machine-learning algorithms that drive them. Second, the CEO is always accountable for the final outcome, in any industry, including insurance.
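That human-factors lesson translates directly into insurance automation. As a rough sketch (the confidence threshold, dollar limit, and field names below are assumptions for illustration, not any carrier's actual rules), an automated claims process can keep a human adjuster in the loop whenever the model is unsure or the stakes are high:

```python
# Hypothetical sketch of a human-in-the-loop gate for automated claims handling.
# Thresholds and field names are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class ClaimScore:
    claim_id: str
    model_confidence: float  # 0.0-1.0 score from a claims model
    payout_estimate: float   # estimated payout in dollars


def route_claim(score: ClaimScore) -> str:
    """Auto-approve only when the model is confident and the stakes are low."""
    if score.model_confidence >= 0.95 and score.payout_estimate <= 5_000:
        return "auto_approve"
    # Otherwise keep a human adjuster in the loop, just as a pilot needs a
    # clear way to understand and override an automated system.
    return "refer_to_adjuster"


if __name__ == "__main__":
    print(route_claim(ClaimScore("C-1001", 0.98, 1_200)))   # auto_approve
    print(route_claim(ClaimScore("C-1002", 0.80, 40_000)))  # refer_to_adjuster
```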
