Last year, at a client’s chemical manufacturing plant, there was a large release of a toxic organic solvent. The consequences ultimately proved minor, but with small changes in the details the release could easily have been reportable for environmental reasons.
To make matters worse, the company blamed the wrong person.
We helped them reinvestigate with an eye on cultural factors. The results they got on their second go-round were profoundly different. That plant, and now the entire company, is changing in response.
First, let’s take a look at the surface facts. As is often the case, every step of the story and every decision along the way made sense at the time, each one following the “keep it simple, stupid” (KISS) approach. Even so, the outcome—a large chemical release—was problematic.
Two operators were working on a high-priority maintenance effort, expected to require multiple shifts to complete, in another part of the plant.
These operators received a radio call to go remove some Energy Isolation (EI) devices (locks and tags) from a pump and tank assembly. The pump had been removed for maintenance (after proper lock-out/tag-out devices had been placed) and then reinstalled. The two operators were given the task of removing the locks and tags, properly opening or closing the associated valves, and restarting the pump.
They did the work, signed it off, and then returned to their previous ongoing project.
The next day, a shift supervisor saw liquid pouring from an open drain valve on the pump that had been restarted by the two operators. He closed the valve. Clean-up was performed. An investigation was conducted.
The investigation concluded that the drain valve had been left open by mistake. One of the operators—the one who signed the paperwork asserting that the job had been done properly—was blamed.
Simple. The guy signed an official document saying he did something, but the truth was that he neither conducted the work himself nor verified it had been done.
The incident’s consequences were significant, but as a matter of pure luck weren’t dire. It seems as if one operator failed to follow simple instructions and didn’t care enough to double-check his work.
It’s easy to imagine what might have gone through the minds of management: Why can’t these guys be more careful? This work is important! Well, we have to hold the guy accountable so he’ll learn a lesson, and maybe other people will learn from his consequences. Meanwhile, let’s get this plant back up and running.
If that is indeed what they thought, it would certainly be understandable. Perfectly natural.
As I mentioned in the beginning, this company is a current client. They started by participating in a training module we offer on human factors analysis in incident investigations. As a final project in that training, we have trainees reinvestigate one of their own incidents with the new tools we’ve taught them. They chose to reexamine this toxic release.
Human factors analysis really starts with a two-step mindset shift:
1. Allow for the possibility that the real story isn’t conveniently simple. Resist the temptation to simplify too soon. We don’t quite agree with Weick and Sutcliffe that a “reluctance to simplify” is comprehensively helpful, but it is very important in the early stages of building a new understanding.
2. Assume, unless and until it proves impossible to do so, that the people involved were trying to do the right thing, and that whatever they did made sense to them at the time.
Slightly—and ironically—oversimplifying human factors analysis here: once the mindset shift is made, the process essentially reduces to a thorough investigation into why the behaviors in the web of events did in fact make sense at the time.
Upon reinvestigation, this client realized the following cultural factors were at play.
Cultural values influenced the operators to be responsive, “can-do” team players. When they got the radio call to go remove EI from the pump assembly, they jumped on it.
Willingness to delegate authority is also one of their cultural values. However, this positive value, together with both a lack of oversight and management’s failure to communicate the larger context, conspired to create an operationally undesirable consequence: A long-term high-priority task was put on hold for a lower-priority short-term task.
Neither of the operators had the “lead,” because there is a cultural value of trusting co-workers. Operator 1, we’ll call him, took the lead on paper by signing the procedure as complete, and thereby suffered the personal consequences. But during the actual job, neither operator was really in charge, and as a result, gaps in responsibility and execution went unnoticed.
Inexplicably disconnected from the realities of risk analysis, this organization’s culture implicitly communicated the following message: Removal of EI does not carry the same weight of responsibility, or receive the same level of vigilance, as placing EI in preparation for maintenance work.
In addition to these cultural drivers, which of course are neither the fault nor the responsibility of the operators, there were a number of other findings that pointed to leadership shortcomings.
There was no supervisory role with responsibility for prioritizing competing requests made of the operators—in this case, the job they were on vs. the job they were called away to do. It’s not that the supervisor failed to prioritize; the supervisor wasn’t expected to prioritize.
Operators are expected to sign off on work performed by other operators that has been confirmed only verbally, sometimes by radio. This leaves them predictably open to miscommunications and assumptions. Sooner or later, with absolute certainty, those avoidable missteps result in harm.
Oversight for EI removal was routinely underemphasized and underperformed. Again, given the inevitability of human error, this is a setup for harm.
At the conclusion of their reinvestigation exercise, the trainees were able to sit around a table, look at each other, and admit, “Uh-oh. We blamed the wrong person.”
One piece of good news is that this organization now has a strong human factors investigation program.
Another is that these leaders understand that deep-seated cultural factors—always emerging from leadership behavior—have a profound impact on both safety and performance. The next step is to purposefully design leadership behavior at all levels of their hierarchy to create a culture that results in exceptional outcomes.
By the way, Operator 2, when he first heard of Operator 1’s situation, approached management and said, “If [Operator 1] gets consequences, then I should get them, too.”
That’s not the voice of a person who doesn’t take personal responsibility for his work. That’s gold.
But consider the consequences that initially made so much sense to management: what impact did that tough conversation have on Operator 1’s level of commitment? On his engagement with his work? What about that of his co-workers? And what happens after a pattern of similar injustice emerges?
That’s how mediocre leadership crushes employee gold. By failing to embrace the complexity of human interactions and how they can create unforeseen organizational outcomes, leaders can be the unwitting source of their own troubles.
Burl Amsbury is focused on ensuring the quality of Vetergy’s clients’ experiences and their ultimate success. He has served in the US Navy as a carrier-based attack pilot, mission commander, and maintenance quality assurance officer. In the private sector, he has been an executive, entrepreneur, and consultant for venture-backed high-growth companies in various industries. The common theme in his work has been an interest in complexity and high risk. He has served in product development, operations, marketing, and sales. Amsbury studied control system design at MIT, earning a BS and an MS. Post-Navy, he returned to MIT’s Sloan School of Management and completed an MS specializing in the design and management of complex systems. He lives in Boulder, Colorado, with his partner and her two young children, where he also acts as business manager for an innovative grass-fed cattle company. He may be reached at burl.amsbury@vetergy.com.