What happened to engineering ethics?
We’ve seen one disastrous news item after another in recent years, almost all of them foreseeable and avoidable. Planes falling from the sky. Nuclear power plants melting down. Foreign powers siphoning user data. Rigged emissions testing. Electrical grids burning states to the ground.
These failures don’t cluster by discipline or nationality, nor do the events share an obvious social structure. Facebook’s machine learning engineers, for the most part, don’t hang out with VW’s German automotive engineers or Japan’s nuclear plant designers. They weren’t trained in the same schools, didn’t share the same textbooks, and don’t read the same magazines.
Instead, a more fundamental thread ties these disparate and horrific stories together: the increasingly toxic alchemy of complexity and capitalism. Only by rejuvenating a culture of safety can we hope to repair both.
Unexpected disasters are really “normal accidents”
Before we assign blame, though, we need to step back and look at these technical systems themselves. Automotive emissions systems, nuclear power plants, airplanes, app platforms and power grids share one thing in common: they are deeply complex and tightly coupled systems.
They are complex in the sense that they have many individual components that interact with one another, sometimes in non-linear ways. They are tightly coupled in the sense that a disturbance in one component can rapidly change the behavior of the entire system.
And so you get a relatively small safety system on the 737 MAX that brings down planes. A relatively limited API on a social platform that leaks whole streams of user data. A power grid that interacts with trees to ignite fires that kill dozens of people.
All of these outcomes were theoretically preventable, but the number of interactions in these systems is uncountable. Small changes can have enormous effects.
Years ago, Charles Perrow wrote a seminal book connecting the increasing complexity and coupling of technical systems to a rise in catastrophic yet “normal” accidents, a phrase he used as the book’s title. His thesis was not that such disasters are rare shocks, but that the very design of these systems guarantees that accidents will eventually occur. No amount of testing or systems design can eliminate every fault among billions upon billions of interactions. Hence, normal accidents.
That may be too cynical an account of engineering’s future. Engineers have met some of this increasing complexity with more sophisticated tools, mostly derived from greater computing power and better modeling. But there are limits to how much technical tooling can help here, given the constraints our organizational behavior places on managing the complexity of these systems.
Management’s lip service to safety
Even if engineers are (potentially) acquiring more sophisticated tools, management most definitely is not.
Safety is a slippery concept. No business leader is anti-safety. None. Every business leader and manager in the world pays at least lip service to the value of safety. Construction sites can be dens of danger, but they always have a “hard hats required” sign out front.
Safety may indeed be the stated first value of almost all of these organizations, and yet you can spend hours inside a company’s 10-K or 10-Q before finding even an iota of discussion of it (except, of course, after disaster strikes).
It is at this intersection of capitalism and complexity that things have gone wrong.
One pattern that ties all of these engineering disasters together is that each had whistleblowers who saw the danger coming before it arrived. Someone, somewhere, knew what was about to happen, and couldn’t press the red button to stop the line.
And of course they couldn’t. That’s what happens when the pressure for quarterly earnings and growth becomes so intense that no one in the organization, not even the CEO, has the power to stop the system.
The strange thing is that these foreseeable disasters aren’t even profitable for their perpetrators. PG&E filed for bankruptcy. Facebook faces fines in the billions of dollars. VW settled its scandal for $14.7 billion. The 737 MAX situation is raising questions about whether Boeing can remain a going concern.
No shareholder wants to hold worthless stock certificates. So where is the disconnect?
Rebuilding an ethical foundation within engineering culture
Ethics begins with leadership at the top, and specifically with better communication about safety and regulatory concerns to all stakeholders, but particularly shareholders. Stock owners in companies with complex technical products need to be told, again and again, that the companies they own will prioritize safety over immediate earnings. The tone should always be one of valuing long-term growth and sustainability.
For those who don’t frequent Wall Street watering holes, it may come as a surprise that such a sales pitch can be difficult. Investors don’t like hearing that their return on equity will lose a few basis points; they would rather just buy a credit default swap and jump ship when the company literally and metaphorically sinks.
However, short-term traders are not the only investors out there. Capital markets are diverse, and trillions of dollars of wealth are managed by people seeking long-term growth without the downside of inevitable disasters. A key part of investor relations is attracting investors who match the culture of the company. If your investors don’t care about safety, no one else will.
One upshot of all these scandals is that there is now a graveyard of companies to point to, which should help with these conversations.
Beyond boardrooms and shareholders, though, engineering cultures need to build in the resilience to ship and approve products only when they are ready. Engineering leaders need to push back on their business executives and explain safety concerns, and they need to constantly reinforce to every individual contributor that safety is a priority.
Engineering managers probably have the most challenging role, since they need to sell both up and down the organization to maintain safety standards. The pattern I have gleaned from reading many disaster reports over the years is that most safety lapses start here: the engineering manager begins to prioritize the business concerns of their leadership over the safety of their own product. Resisting those pecuniary impulses alone is not enough; safety must be the watchword for everyone.
Ultimately, for individual contributors and employees, the key is to stay vigilant: think about safety and security while doing engineering work, and raise any concerns early and often. Safety requires tenacity. And if the organization you work for is corrupted enough, then frankly, you may need to hit that proverbial red button and blow the whistle to stop the madness.
Here at Extra Crunch, we are trying to do our part to raise awareness of these issues. Our resident humanist, Greg Epstein, interviews all kinds of thinkers about the challenging ethics of our modern technical world.
Take inspiration from his work, because the disappearance of the ethical engineer does not have to be a fait accompli. Nor do normal accidents, as normal as they are, have to be so common. We can repair this corner of capitalism by adding better tools and accountability at all levels of technical organizations. And in the long run, staying out of that burgeoning corporate graveyard is an incredible investment in future returns.