This article originally appeared in IMPO's April 2015 print issue.
We have all experienced it when in a car. The voice from the back that says, “You are following too closely,” “You are driving too fast” or “You just went through a red light.” It’s the backseat driver. Backseat drivers are quick to point out when the driver falls short in obeying the rules of the road. They observe the driver’s actions and compare them to the rules they have learned about what is proper and acceptable in similar situations. When they observe a negative deviation, they immediately bring it to the driver’s attention. Their intention is to point out the driver’s error so the driver will remember the correction when a similar situation happens again. These “backseat” corrections lag behind the thought process used to determine what actions to take.
In his 1931 book, “Industrial Accident Prevention: A Scientific Approach,” Herbert William Heinrich claimed 88 percent of accidents are caused by “unsafe acts of persons.” Mr. Heinrich also described what is often referred to as Heinrich’s accident pyramid (See Figure 1), which promoted the idea that by reducing the occurrence of minor incidents, an organization would also reduce major injuries. Heinrich’s theories, though often challenged, have provided momentum for improved workplace safety through employee protection and hazard reduction. But it hasn’t been enough.
To improve safety, many businesses adopt improvement programs that establish a partnership between management and employees to continually focus people's attention on safety behaviors. These programs encourage safe behavior through structured processes of reporting and investigation. Although peer pressure can be a powerful motivator in the workplace, it may not be sufficient to achieve more than incremental improvements.
Has the backseat driver technique crept into the safety efforts at your workplace? When an employee’s actions are observed as not in accordance with the safety rules and procedures, the error is brought to their attention. The workplace reality is that this error often goes unaddressed, or even unobserved, until a safety incident occurs. The safety incident becomes the trigger for an investigation. The investigation determines what actions led up to the safety incident and often stops at the point of identifying the human error. The resulting corrective measures attempt to contain the human error by revising policies, enhancing procedures, retraining employees, punishing offenders, or some combination thereof. Such corrective measures lag behind the worker’s thought process.
So, how do we change these reactive patterns and step ahead of the thought processes that introduce human errors? The answer may lie within your maintenance department. Yes, the maintenance guys who keep things running. Maintenance efforts have evolved from repairing equipment failures, to detecting when failures are developing, to ensuring failures don’t happen at all. How have they done this? Many have adopted the philosophy and methods of reliability. Reliability goes beyond achieving dependable equipment. It becomes a culture embedded in the organization that strives to minimize human error. In true reliability efforts, the identification of the human error becomes the starting point for the incident investigation. True reliability seeks to find those deeper causes, termed latent roots, which influence the thought process that resulted in the human error.
If we view employee injuries, safety incidents, and near misses as equivalent to equipment breakdowns, it makes sense to apply those same reliability methods to safety. In following this method, safety incident investigators would ask “Why” questions and not just the “Who,” “What,” “Where” and “How” questions. Reliability seeks to dig beneath the decision-maker and find out why their assessment of the situation, and their resulting decision for action, made sense to them at the time (See Figure 2).
More than likely, they did not intend or even foresee the undesired outcome that occurred. (If the act was intentional, however, prompt disciplinary action is necessary.) Why should the focus be on their behavior as the cause when their assessment of the situation was actually influenced by the systems within their work environment? It is equivalent to examining individual trees to determine why each one is dying, while failing to comprehend that the entire forest is dying and needs protecting as a whole. In applying the reliability method of root cause analysis to a safety incident, the investigation seeks to find the influences that made the employee think their action was acceptable. In other words, why did their assessment of the situation make sense at the time they made their decision for action? Investigating in this manner leads to the discovery of the latent root causes and, potentially, an understanding of the effect those causes have on the entire organization.
When the latent root causes are identified, corrective actions shift from punishment, retraining or policy changes to cultural or system changes that remove the influences promoting the undesired or unsafe actions. Here is where reliability has the greatest impact. Like the nutrients taken in by a tree’s roots, the influences of cultures and systems affect the whole organization, not just one facility, department or individual. Any changes made as a result of finding latent root causes have an impact far beyond correcting or preventing the initial incident. As the changes take hold, many future, potentially unrelated incidents are also prevented. The organization as a whole reaps the benefit.
Therefore, consider the reliability approach to human error reduction and view safety incidents not as the fault of individuals but as the result of impaired systems and the effect those systems have on the decision making of the human beings working within them. Then strive to cultivate a work environment devoid of the influences that impair effective human decision making. For it is through effective decision making that people foresee the consequences of their decisions and thereby contribute to exceeding the safety goals of the organization.