Systematic Failures Of Leadership

Of all inputs for rational decision-making, profit versus risk has the greatest propensity to entangle us, and it is the ability to balance that equation that makes for real leadership.

For some time I have been interested in the whole notion of risk. Subsumed in the concept of risk is the idea that decisions, and the behaviors that result from those decisions, have positive and negative consequences. Virtually every decision we make is a hypothesis about the future, a bet; and we place thousands of internal wagers every single day, large and small. When we decide to drive 80 mph in a 70 mph zone, for example, we’re betting that the cops are not over the rise waiting to give us a ticket; and when we make a product design decision, we’re betting our design will accomplish the goal we have in mind. We make our decisions based on a hypothesis about how we believe something will turn out.

In fact, we are wired to operate this way. Our very perceptual system is based on such a probabilistic notion; perception is a constructive process. When you look at something, your brain is constructing your world not only as you see it but also as you expect to see it. Your brain takes what you have experienced, adds what you wish to accomplish, combines it with what is perceived, and creates a behavior based on the highest probability of accomplishing your expressed goals. Even when goals are objective rather than emotionally charged, such as an engineering design or a mechanical problem, decision making is often confused by not-so-objective contaminants like laws, personal morality, money, pride, ego, and the like. A failure to understand these contaminants causes the big problems, the catastrophic failures that we all pay for, some more dearly than others.

Every decision we make, from turning left at a traffic light to choosing a bolt for a pressure vessel, is a hypothesis. It is an experiment based on a bet. Occasionally, our bets are wrong, and these are called failures.
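
To make the betting metaphor concrete, here is a minimal sketch, not from the article, that weighs the speeding example as an expected-value calculation. Every number in it (the chance of a trooper over the rise, the ticket, the value of arriving early) is invented purely for illustration.

```python
# A minimal sketch of "every decision is a bet": compare the expected cost
# of speeding versus staying at the limit. All probabilities and dollar
# figures are hypothetical, invented for illustration only.

def expected_cost(p_bad_outcome: float, cost_bad: float, cost_baseline: float = 0.0) -> float:
    """Expected cost of a decision: baseline cost plus probability-weighted downside."""
    return cost_baseline + p_bad_outcome * cost_bad

# Option A: drive 80 mph in a 70 mph zone.
#   - assume a 2% chance the cops are over the rise, costing a $250 ticket
#   - assume arriving a few minutes earlier is "worth" $5 (a negative cost)
speeding = expected_cost(p_bad_outcome=0.02, cost_bad=250.0, cost_baseline=-5.0)

# Option B: stay at the limit; no ticket risk, no time saved.
obeying = expected_cost(p_bad_outcome=0.0, cost_bad=0.0)

print(f"Expected cost of speeding: ${speeding:6.2f}")
print(f"Expected cost of obeying:  ${obeying:6.2f}")
# With these made-up numbers the bet happens to come out even; the point is
# only that every decision implicitly weighs profit against risk.
```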

Recently, I attended the Deep Offshore Technology conference in New Orleans and listened to a number of interesting presentations on the state of the industry. Some sessions presented technical papers, but the bulk of them dealt with the issue of risk.

As onshore oil becomes more difficult to find and produce, we are increasingly turning to deeper and deeper waters to feed our energy needs. As you can imagine, what is hard to do onshore becomes inordinately more difficult under 1,000 feet of salt water, from a floating platform in the middle of the North Sea. The people at this conference met this challenge with considerable engineering wisdom derived from years of experience, carefully thought-out processes, and piles of money. As we all know, the potential profits in the offshore environment are enormous, but the consequences of failure are monumental.

Stan Bond, a professional engineer and Vice President at the Hess Corporation, opened the conference with a thought-provoking summation of where we are in understanding failure and what we can do to prevent it. What follows is my limited understanding of this, not Stan’s; so please don’t string him up if you think I’ve got it wrong.

Engineering is fundamentally involved with the failure of machines and systems. That’s what we do. We make gadgets that work, and when they don’t work we figure out why and fix them. Shortly after the creation of the first wheel and its inevitable failure, our early engineering forefathers looked for ways to make wheels last longer. Thus was born the concept of failure mode analysis, and we eventually began to create reliable and predictable machines for an industrial revolution. This is the machine phase of failure analysis.
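
As a rough illustration of what that machine phase formalizes today, the sketch below computes a classic FMEA risk priority number (RPN = severity × occurrence × detection, each rated 1–10) for a few failure modes of a wheel assembly. The failure modes and ratings are hypothetical, chosen only to show the mechanics of the ranking.

```python
# Illustrative sketch of failure mode and effects analysis (FMEA):
# rank hypothetical failure modes of a wheel assembly by risk priority
# number, RPN = severity * occurrence * detection. Ratings are invented.

from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 = negligible harm, 10 = catastrophic
    occurrence: int  # 1 = very unlikely, 10 = near certain
    detection: int   # 1 = almost always caught before harm, 10 = almost never caught

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("hub fatigue crack", severity=9, occurrence=3, detection=6),
    FailureMode("bearing seizure",   severity=7, occurrence=4, detection=4),
    FailureMode("rim corrosion",     severity=4, occurrence=6, detection=2),
]

# Rank the hypothetical failure modes so the riskiest get attention first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:<20} RPN = {m.rpn}")
```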

Later, perhaps after the advent of wrongful-injury lawsuits, some insightful soul watched an operator lop off his hand in a machine, and the science of human factors was born. We moved from an age of “the machine as a system that can be analyzed and its failures prevented” to one in which we realized that machines in and of themselves were no longer the root cause of failures; it was often the nut behind the wheel that caused them.

Soon after that, we began to focus on the issue of institutional safety. Management and OSHA started creating and enforcing safety procedures and punishing unsafe behaviors (as if lopping off a hand wasn’t punishment enough), until finally we began to understand that safety had to be internalized in the psyche of each individual, not just advocated with a poster on a break-room wall. While management could reinforce safety lessons and OSHA could punish unsafe behaviors, it was human perception and cognitive processes that produced decisions and actions, so safety had to be ingrained in the individual.

Here we are today, with a system that mostly works and has been worth every penny it has cost, in ways seen and unseen. Safety is “in” and failures are on the run -- and yet, stuff still happens.

As Bond explained it, the final step in our march to zero failure is systemic failure prevention. This is the toughest step, the most fraught with uncertainty, and the most elusive. My take is that systemic failures stem from places in the universe where we have little control. They come from fuzzy societal factors like loopholes in laws, lax regulations, greed, and pride, which let us build for a magnitude 8.0 earthquake and then be hit by a 9.0 and the ensuing tsunami. They stem from political factors, like launching a shuttle in unfavorable conditions so as not to disappoint a President, or cultural factors, like not upgrading nuclear reactors for financial reasons and then denying catastrophe to save face.

The BP Macondo disaster was a combination of lax regulation, poor decisions, and subsequent risky behaviors that had not been punished by failure in the past. The good news about these kinds of systemic failures is that they are rare; the bad news is that they are usually catastrophic.

It is reasonable to believe that the common root of failure is a failure of leadership. After all, leaders are supposed to set boundaries and expectations. Good leaders don’t tell people what to do; they breed values into those who work under and around them by their words, their deeds, and their clear expectations. I have yet to meet a leader who can inculcate values that will, with total certainty, anticipate all of the vagaries and complexities of life. Even if he or she could, the best of leaders are subject to the same contaminants as the rest of us, especially that most complex of contaminants, profit versus risk.

Perhaps that is the lesson here: Of all inputs for rational decision-making, profit versus risk has the greatest propensity to entangle us, and it is the ability to balance that equation that makes for real leadership.

Mike Rainone is the co-founder of PCDworks, a technology development firm specializing in breakthrough product innovation. Contact him at [email protected] and visit www.pcdworks.com.
