The widespread outrage at Boeing’s just-revealed internal memos is wholly justified. The very idea that employees themselves were aware of problems in flight training for the 737 MAX and let them slide anyway is morally indefensible. They whispered to each other that they would never let their families fly on that plane. One said the plane was “designed by clowns, who are in turn supervised by monkeys.”
Some of these people were presumably in a position to blow the whistle. They didn’t. Company culture didn’t encourage that. Bosses didn’t want to hear it. Or something. That’s a damning indictment of a top-of-the-heap company to which hundreds of millions of passengers, crews, and pilots entrust their lives. Mistakes are tolerable. But this seems to approach the level of a known coverup that ended in terrible loss of life.
There are, however, complications to this clean picture of corporate malfeasance and regulatory negligence. Airline manufacturers do not want their planes to fall from the skies due to bad design or poor training. It’s economically irrational. Every institutional incentive of a for-profit company, supervised by insurers and scrutinized constantly by consumers and regulators, would seem to weigh against letting known risks be swept under the carpet.
The problem, in this case, was regulatory cost avoidance. Boeing didn’t want to spend the time and money retraining pilots, as would have been required for a new plane – that is, it didn’t want to comply with regulatory strictures. Part of the problem, it appears, is that Boeing passed off its new plane – one with vast design differences from its supposed predecessor – as merely an updated version of the old one.
As the New York Times tells us, “Several employees seemed consumed with limiting training for airline crews to fly the plane, a significant victory for Boeing that would benefit the company financially.” As for the 737 MAX just being a 3.0 of the 737, the Times further reports:
The 737 Max is a legacy of its past, built on decades-old systems, many that date back to the original version. The strategy, to keep updating the plane rather than starting from scratch, offered competitive advantages. Pilots were comfortable flying it, while airlines didn’t have to invest in costly new training for their pilots and mechanics. For Boeing, it was also faster and cheaper to redesign and recertify than starting anew…. Boeing avoided overhauling the jet in order to appease airlines, according to current and former Boeing executives, pilots and engineers, some of whom spoke on the condition of anonymity because of the open investigations. Airlines wanted new 737s to match their predecessors so pilots could skip expensive training in flight simulators and easily transition to new jets.
Which is to say that Boeing was eschewing the pilot-training mandates, and hoping that the responsibility for doing so would fall not to the company but to the regulators whose actions they largely but not entirely controlled through regulatory capture. After all, it’s the government in charge of certifying safety; if the bureaucrats say it is safe, surely it is.
Such a system is easily gamed.
What this appears to be is a diffusion of responsibility that is at least in part due to broken incentives created by the regulation itself. The regulators purport to oversee and thus rein in the producers. What they often do in fact is distract management, motivate capture, incentivize cheating, divert energies, dilute liability, and diffuse responsibility. When something goes wrong, management can too easily slough off responsibility onto others: hey, we complied, so leave us alone.
It’s nothing unusual at all. It’s been going on for more than a century in a wide range of industries under regulatory control. The examples are countless but let’s revisit the nation’s first food regulations concerning meat packing passed in 1906. After a series of non-scandals that were mostly mythology, the US government sent its regulators into plants with a new method for determining whether meat was fit for consumption.
The legislation required federal inspectors to be on-site at all hours in every meat-packing plant. At the time, regulators came up with a pathetic method for detecting bad meat, namely poking a rod into the meat and smelling the rod. If it came out smelling clean, they would poke the same rod into the next piece of meat and smell it again. They would do this throughout the entire plant.
As Baylen J. Linnekin points out in “The Food-Safety Fallacy: More Regulation Doesn’t Necessarily Make Food Safer” (Northeastern University Law Journal, vol. 4, no. 1), this method was fundamentally flawed. You can’t necessarily detect pathogens in meat by smell. It takes a long time for bacteria to begin to stink. In the meantime, bacteria can spread disease through touch. The rod could pick up bacteria and transmit it from one piece of meat to another, and there was no way for inspectors to know about it. This method of testing meat most certainly spread pathogens from bad meat to good meat, ensuring that an entire plant became a house of pathogens rather than having them restricted to just one carcass.
The author writes:
USDA inspectors undoubtedly transmitted harmful bacteria from one contaminated piece of meat to other uncontaminated pieces in untold quantities and, consequently, were directly responsible for sickening untold numbers of Americans by their actions.
Poke-and-sniff — incredibly a centerpiece of the USDA’s meat inspection program until the late 1990s — was, in terms of its sheer efficiency at transmitting pathogens from infected meat to clean meat, nearly the ideal device. Add to this the fact that the USDA’s own inspectors were critical of the inspection regime from the start, and that the USDA abdicated its inspection role at hundreds of meat processors for nearly three decades, and it becomes quite apparent that instead of making food safer, poke-and-sniff made food and consumers less safe.
It was true back then, and it remains true today, that regulations that purport to add oversight can end up having the opposite effect.
To understand how this works, consider the following hypothetical.
Let’s say that a number of US households report incidents of disease emanating from the kitchen. Congress passes legislation forcing the Food and Drug Administration to directly inspect every home to make sure that no unwrapped food stays on counters, that all knives are free of food particles, and that all food in refrigerators is in sealed containers. American families – newly terrified of disease but now also scared of regulators — fly into a panic to make sure these three things are done lest they face criminal penalties. This generates the same false confidence we have in government-approved drugs: we let down our guard with the presumption that compliance guarantees quality. Meanwhile, other hazards, such as old food in the fridge, are neglected. It would not be entirely surprising to see this attempt to fix the problem result in the opposite.
That’s just the household case. The industrial case is far worse, because industry has every incentive to gain control of the regulatory structures themselves, and it expends vast resources to lobby, influence, investigate, comply, and maintain a buddy-buddy relationship with its overlords. This is just a cost of doing business.
The normal business of business means accepting responsibility for the effectiveness and safety of their products and services. But accepting regulatory overlords can mean gaming the system while passing that responsibility onto others.
This appears to be the underlying dynamic at work in the case of the Boeing 737 MAX. Boeing played fast and loose with compliance, thinking it had secured some major cost savings, until its mistake ended in catastrophe. It is very possible that a more deregulated environment – one in which private governance takes a larger role and there is clarity about roles and expectations – would have put control, responsibility, and liability right where they belong. It’s likely that the new plane would have been clearly labeled as such, and pilots would themselves have demanded retraining. But passing off the new plane as nothing more than an upgrade misled the pilots themselves, who have said that they didn’t even know about the software changes.
While the response to this might be to place all blame with the company and call for ever more power for the bureaucrats in charge of regulations, the result of such a change could be to misalign incentives even further.
If we want private businesses to be wholly liable, to the consumers whom they serve, for the safety and effectiveness of their products – and absolutely we do – the right response is to let businesses bear full responsibility, and to do so without the confusion and static introduced by coercive third-party bureaucracies. Such agencies know less about the industries they control than the private engineers themselves do, and they have far less incentive to do a good and thorough job than private industry does.
When government muddies the waters of responsibility for guaranteeing quality and safety, the result is false confidence and the acceptance of unwarranted risks that make problems worse, not better.