Maria Montero

The disappearance and rebirth of the ethical engineer.

What happened to engineering ethics?

We’ve seen one disastrous news item after another in recent years, almost all of them foreseeable and avoidable. Planes falling from the sky. Nuclear power plants melting down. Foreign powers hoovering up user data. Faked environmental tests. Electrical grids burning states to the ground.

These failures don’t cluster by discipline or nationality, nor do the people behind them share an obvious social network. Facebook’s machine learning programmers for the most part don’t hang out with German VW automotive engineers or Japanese nuclear plant designers. They weren’t taught in the same schools, didn’t share the same textbooks, and don’t read the same magazines.

Instead, a more fundamental thread ties these disparate and heinous stories together: the increasingly harmful alchemy of complexity and capitalism. Only through a rejuvenation of safety culture can we hope to rein in that pair.

Unexpected disasters are really “normal accidents”

Before we begin assigning blame, though, we should step back and look at these technical systems themselves. Automotive emissions controls, nuclear power plants, airplanes, application platforms and power grids share one thing in common: they are highly complex and tightly coupled systems.

They are complex in the sense that they have many individual parts connected to one another in sometimes non-linear ways. They are tightly coupled in the sense that a disturbance to one component can rapidly change the operation of the entire system.

And so you get a reasonably small safety system on the 737 MAX that brings down planes. You get a reasonably limited API on a social platform that leaks whole streams of user data. And you get a power grid that interacts with trees, igniting fires that kill dozens of people.

All of these outcomes are theoretically preventable, but the scale of the interactions in these systems is essentially uncountable. Again and again, small changes can have outsized effects.

Years ago, Charles Perrow wrote a splendid book connecting the increasing complexity and coupling of technical systems to the rise of catastrophic yet “normal” accidents, a phrase he used as the book’s title. His thesis was not that such disasters are rare events that should shock us, but that the very design of these systems guarantees that accidents will occur. No amount of testing or system design can prevent failure somewhere among billions upon billions of interactions. Hence, normal accidents.

That account of the future of engineering may be too cynical, though. Engineers have met some of this increasing complexity with more sophisticated tools, mostly derived from greater computing power and better modeling. But there are limits to how much technical tools can help here, given the limits our organizational behavior places on managing the complexity of these systems.

Management’s safety charade

Markus Pfueller, Volkswagen’s lead lawyer, speaks to the press at the Stadthalle congress center ahead of a trial involving investors suing automaker Volkswagen AG for financial losses from the diesel emissions scandal, on 10 September 2018 in Braunschweig, Germany. (Photo by Alexander Koerner / Getty Images)

Even if engineers are (potentially) acquiring more sophisticated tools, management itself definitely is not.

Safety is a very slippery concept. No business leader is anti-safety. None. Every individual business leader and manager in the world pays at least lip service to the value of safety. Construction sites may be hotbeds of danger, but they always have a “hard hats required” sign out front.

Safety may indeed be the stated first value of almost all of these organizations, yet you can spend hours inside a company’s 10-K or 10-Q before finding an iota of discussion of it (except, of course, after disaster strikes).

It is at this intersection of capitalism and complexity that things have gone wrong.