Episode #022- Safety, defined.

Underneath every simple, obvious story about ‘human error,’ there is a deeper, more complex story about the organization.

-Sidney Dekker, from The Field Guide to Understanding ‘Human Error’

After my recent series on former Alcoa CEO Paul O’Neill, listeners have commented that safety does not have a place in their workplaces: office environments, banks, retail, and so on.

That is not true.

First and foremost, let’s define safety, so we are all on the same page.

The only tasks that should have some form of structured process attached to them are those where an unintended consequence has the potential to land workers in the hospital or the morgue.

Now, for those organizations where physical injury is not a significant threat, here is the definition of safety: tasks or processes where an unintended consequence has the potential to land the organization in the hospital or the morgue.

Let me detail this a little. If performing a task has the potential to cause a significant injury or fatality (SIF) to either the worker or the organization, then it needs processes in place that give the worker the tools and resources to mitigate the hazards, even if an unintended event occurs.

Your workers don’t show up to work to do a bad job, get hurt, or worse.

Caveat: In this day and age, there may be bad actors, terrorists, or other threats, but those are criminal in intent and have to be dealt with in the hiring phase. That is a future podcast topic.

Your workers show up to perform a day’s work, get paid, and go home to their families, friends, and loved ones. They do not want to get hurt or hurt a coworker.

When an unintended outcome occurs, we must take the psychologically proven “local rationality principle” into account. Grounded in a large body of cognitive-science research, the local rationality principle says that what people do makes sense to them at the time, given their goals, focus of attention, and knowledge. Otherwise, they would not be doing it. Pilots do not check in for a flight in order to die. Nurses do not sign in to go kill a patient. Accountants do not use the wrong formula in order to destroy the annual financials.

The local rationality principle is important. If people did things that seem, at first, inexplicable, it is not because they were doing inexplicable things. Jens Rasmussen, one of the originators of what is known as the “New View” of safety investigation, suggested that it is because we (bosses, leaders, and incident investigators) are not positioning ourselves to understand why it made sense for them to do what they did. That burden is on us as leaders: to understand the true nature of the work performed, not the work as we perceive it. We need to put ourselves in the worker’s shoes and understand why it made sense to them to do what they did. (Dekker, 2014)

If we don’t wait until an incident occurs, but rather get out there NOW and understand the true nature of the work, we can create goals (not rules) specifically aimed at helping the worker follow a safe path and watch for potential hazards.

These goals and the path to safe work are specific to the pieces of the task that have the greatest impact on potential unintended outcomes. These are not bullet-point how-to lists.

No one likes to be micro-managed. Poor to outright bad managers, you know the ones, the incompetent or just plain stupid folks (if not, go listen to Episodes 6 and 21), follow a general path. John Maxwell, in his famous leadership book The 5 Levels of Leadership, calls these folks Level 1 leaders and says this about them:

They rely on their position and title to push people. Here is how they think; I quote:

  • Top-down: “I’m over you.”
  • Separation: “Don’t let people get close to you.”
  • Image: “Fake it ‘til you make it.”
  • Strength: “Never let ‘em see you sweat.”
  • Selfishness: “You’re here to help me.”
  • Power: “I determine your future.”
  • Intimidation: “Do this or else!”
  • Rules: “The manual says…”

Fixing the first seven of those ways poor managers think is for other podcast episodes. But the last way of thinking, RULES thinking, is what we get to mess with here.

We know from past episodes, whether it be the Peter Principle or the Dunning-Kruger Effect, that there are a lot of people out there in leadership positions who are just plain lost in the role. Because they cannot work out how to decide, or need to keep those around them in the dark to protect their ignorance, these folks default to “by-the-book leadership”: following the rules and enforcing them to the letter upon those they, quote, “lead.”

When we create task performance documentation that outlines safety goals and gives the worker the ability to reach the intended, safe end, we allow the individual to use their knowledge and understanding of the specific task and conditions to do great work. Do not squash them with rules or demean and infantilize them into oblivion.

Side Note: Do you have a group, unit, division, or facility with exceptionally high worker turnover? If so, chances are very high that you have a Level 1 “leader” who is causing the worker discontent. One of the super cool second-order effects of removing “rules” and implementing goals and worker autonomy is that your rule-following Level 1 “leader” will stand out through their own failings. And because, like Paul O’Neill, you have created a culture where retraining or firing an employee is acceptable no matter their “importance,” their loss will be accepted, understood, and most often celebrated.

I know this process may sound counter-intuitive to what you or your organization are doing now. Chances are really high that what you are doing now is “the way we have always done it.”

No innovation, improvement, or excellence comes from doing the same as everyone else or following a decades-old path.

Take a leap and do what is right for your people and your organization.

The ideas of Human Performance (HP) stem from the post-incident investigations of the 1979 Three Mile Island nuclear accident. It took the Department of Energy almost 30 years, but it eventually arrived at the ideas and processes of HP through experimentation, psychology, and testing. Here are the 5 Principles of Human Performance (Conklin, 2019):

  1. Error is normal. Even the best people make mistakes.
  2. Blame fixes nothing.
  3. Learning and improving is vital. Learning is deliberate.
  4. Context influences behavior. Systems drive outcomes.
  5. How you respond to failure matters. How leaders act and respond counts.

In the years since the release of the DOE documents, their data and information have led to what is called “Safety Differently.”

This is a whole other series of future podcasts, but here is a quick glimpse:

  1. Safety is not defined by the absence of accidents but by the presence of capacity.
  2. Workers aren’t the problem; workers are the problem solvers.
  3. We don’t constrain workers in order to create safety, we ask workers what they need to do to work safely, reliably, and productively.
  4. Safety doesn’t prevent bad things from happening, safety ensures good things happen while workers do work in complex and adaptive work environments.

The current state of safety processes and incident investigation is as follows.

I apologize in advance for getting passionate about this.  Uh, No I don’t!

The majority of safety programs, EH&S programs, and just about any other compliance programs are what is called “Behavior-Based.” These systems are founded on the assumption that:

Workers, in this view, are a problem to control. People’s behavior is something to control, something that you must modify (emphasis mine). Leaders of these EHS programs believe you have to start with people’s attitudes because those influence their behavior. EHS tries to shape these attitudes with posters and campaigns and sanctions, which they hope will impact workers’ behavior and reduce their errors (even though there is no scientific evidence that any of this works). (Dekker, The Field Guide to Understanding ‘Human Error’, 2014)

Also described as:

Safety is the absence of negative events. A system is safe if there are no incidents or accidents. The purpose of safety management is to ensure that as little as possible goes wrong. The focus is on negative events and reducing their severity and number. This often translates into trying to reduce the variability and diversity of people’s behavior- to constrain them and get them to adhere to standards (emphasis mine). (Hollnagel, 2014)

In November 2018, I was on an EHS weekly conference call when the Senior VP (top guy) of the EHS department opened with:

“There have been a lot of slips, trips, falls, and turned ankles recently.  We have to control the worker’s behavior before this gets out of hand.”

Remember, at the beginning of the podcast I called out those poor to outright bad managers: you know the ones, the incompetent or just plain stupid folks. (If not, go listen to Episodes 6 and 21.) This SVP I just quoted fits the bill in spades.

His ignorance and need to enforce his power put many a worker in the hospital.

This behavior in EHS departments is the norm. Someone gets hurt, and the investigation into the event stops at “Human Error.” This is by design: as part of the “Behavior-Based” safety mindset, EHS can deflect blame from the organization by saying the injured worker “made a mistake,” is “stupid,” “did not follow policy,” or any number of things. But if they were to truly follow a root cause analysis, they would find the cause of the incident is systemic, not individual.

This old view of safety, the Behavior-Based one, can also be called the “Bad Apple Theory” of safety. It maintains:

  • Complex systems would be fine, were it not for the erratic behavior of some unreliable people (bad apples) in them.
  • ‘Human Errors’ cause accidents: more than two-thirds of them.
  • Failures come as unpleasant surprises. They are unexpected and do not belong in the system. Failures are introduced to the system through the inherent unreliability of people.

The Old View maintains that safety problems are the result of a few Bad Apples in an otherwise safe system. These Bad Apples don’t always follow the rules, and they don’t always watch out carefully. They undermine the organized and engineered system that other people have put in place (people who have no experience performing the task). This, according to some, creates safety problems.

“It is now generally acknowledged that human frailties lie behind the majority of accidents. Although many of these have been anticipated in safety rules, prescriptive procedures, and management treatises, people don’t always do what they are supposed to do. Some employees have negative attitudes to safety which adversely affects their behaviors. This undermines the system of multiple defenses that an organization constructs to prevent injury and incidents.”

This paragraph embodies all of the tenets of the Old View:

  • Human frailties lie behind the majority of accidents. ‘Human errors’ are the dominant cause of trouble.
  • Safety rules, prescriptive procedures, and management treatises are supposed to control erratic human behavior.
  • But this control is undercut by unreliable, unpredictable people who still don’t do what they are supposed to do.
  • Some Bad Apples have a negative attitude toward safety, which affects their behavior. So not attending to safety is a personal problem, a motivational one, and an issue of individual choice.
  • The basically safe system, of multiple defenses carefully constructed by the organization, is undermined by erratic or unreliable people.

So, in other words: we are so smart that we would have fixed dangerous work, if it weren’t for all these mean, stupid, or accident-prone people doing the work.

This view, the Old View, is limited in its usefulness. In fact, it can be deeply counterproductive. It has been tried for decades, across every industry, without noticeable effect. Safety improvement comes from abandoning the idea that errors are causes and that people are the major threat to otherwise safe systems. Progress in safety comes from embracing the New View. (Dekker, 2014)

Get out on the sales floor, walk the cubicles, meander in the shop, or drive out in the field. Talk to your people, ask them questions, and let them tell you what the hazards are (this comes back to Episode 4, The Right Questions) and how to best avoid them. Let them know you are listening to them, using their skills and experiences to lead the company into the future. You will be amazed at how quickly acceptance of change occurs.

Well, I think we have done a good job of diving into safety, defined. In coming episodes, we will dive deeper into these ideas, their reasoning, and real-world observations: how we can ensure these processes drive us to excellence, and how well-managed safety affects our lives, businesses, and organizations.

Links to all the quoted resources are in the show notes and in the transcript on my website, Eddiekillian.com

Join me next Tuesday as we continue to travel the path of what is difficult, perilous, and uncertain as we explore introducing A New Order of Things.

I am your host, Eddie Killian. And this concludes Episode 22.

References

Conklin, T. (2019). The 5 Principles of Human Performance. Santa Fe: PreAccident Media.

Dekker, S. (2011). Drift Into Failure. Boca Raton: CRC Press.

Dekker, S. (2014). The Field Guide to Understanding ‘Human Error’. Boca Raton: CRC Press.

Department of Energy. (2009). Human Performance Improvement Handbook Volume 1: Concepts and Principles. Washington D.C.: Department of Energy.

Department of Energy. (2009). Human Performance Improvement Handbook Volume 2: Human Performance Tools for Individuals, Work Teams, and Management. Washington D.C.: Department of Energy.

Ehrlinger, J., Johnson, K., Banner, M., Dunning, D., & Kruger, J. (2008). Why the Unskilled Are Unaware: Further Explorations of (Absent) Self-Insight Among the Incompetent. National Institute of Health, 98-121.

Hayes, A. (2023, March 28). Dunning-Kruger Effect: Meaning and Examples in Finance. Retrieved from Investopedia.com: https://www.investopedia.com/dunning-kruger-effect-7368715

Hollnagel, E. (2014). Safety-I and Safety-II: The Past and Future of Safety Management. Farnham, UK: Ashgate.

Kruger, J., & Dunning, D. (1999). Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

Peter, L., & Hull, R. (1969). The Peter Principle. New York: Bantam Books.
