Emergency Preparedness

Look Out for Black Swans … and Swiss Cheese

So-called “black swan” events are those with low probability but high consequence, and they pose a unique challenge for security, emergency, and safety professionals. See what two experts had to say about the “Swiss cheese” systems that can precipitate such an event and how the holes in these systems can be patched.


At Safety 2017, the annual professional development conference and exposition of the American Society of Safety Engineers (ASSE), Debby Shewitz, CSP, and Carol J. Robinson, CSP, CIH, spoke to a room full of professionals about black swan events in a session titled “Understanding Low Probability, High Consequence Events.”

Black swan events are those that are unexpected and have few comparable historic occurrences, Robinson explained. But when these incidents do occur, the event can have severe, even catastrophic, outcomes—in short, they’re low probability, high consequence. Impacts of these events include serious life-altering injuries or death; economic impacts; and other factors that strongly affect the surrounding community. Some examples include:

  • The 2004 tsunami in the Indian Ocean
  • Hurricane Katrina in 2005
  • The Fukushima nuclear incident in 2011

The tricky thing about black swan events is that standard environment, health, and safety (EHS) metrics don’t apply, said Robinson. Even companies with excellent safety or security records could be affected by these events. But outside the realm of natural disasters, such events can be influenced or partially controlled.

Robinson shared one example of such an event that she had a professional connection to—a resin plant explosion in Chicago. How could this event have been prevented?

Theories of Accident Causation

Robinson listed several theories that seek to explain accidents.

  • The Domino Theory, first proposed by H.W. Heinrich;
  • The Human Factors Theory, which suggests the chain of events leading to an accident is caused by human error;
  • System Theory, which postulates that accidents arise from the interaction between humans, machines, and the environment; and
  • Energy Release Theory, developed by Dr. William Haddon Jr., which suggests that mitigating the risk of accidents can be accomplished through controlling energy by placing barriers between the energy source and a vulnerable target.

However, Robinson focused her attention on the Cumulative Acts Theory, also known as the “Swiss Cheese Effect,” wherein no single issue is enough to cause a major event.

Where Are the Holes in Your Systems?

In theory, your organization’s safety systems are solid barriers between danger and workers. However, the Swiss Cheese Effect acknowledges that, in reality, systems of defenses have holes in the various protections standing between hazards and the potential losses of people and assets.

These holes in successive barriers aren’t usually lined up, however, which means that despite the imperfections in the system, the likelihood of an incident is low. But when more and more holes appear in the different systems, there is the chance that a series of them will align across multiple barriers, leading to a major event.
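The multiplicative logic behind this model can be made concrete with a short, illustrative calculation. This is a toy sketch, not something presented in the session: it assumes each barrier fails independently with some probability (its “hole”), so a major event requires every barrier to fail at once, and the combined probability is the product of the individual ones. All numbers below are invented for illustration.

```python
def alignment_probability(hole_probs):
    """Probability that holes align across ALL barriers at once,
    assuming each barrier fails independently (a toy model)."""
    p = 1.0
    for hp in hole_probs:
        p *= hp
    return p

# Three barriers, each failing 5% of the time: the event stays rare.
print(alignment_probability([0.05, 0.05, 0.05]))  # roughly 1.25e-4

# As holes widen (failure rates creep up), risk grows multiplicatively.
print(alignment_probability([0.2, 0.3, 0.25]))    # roughly 0.015
```

The takeaway matches Robinson’s point: no single hole causes the event, but every hole that widens, or every barrier that quietly degrades, multiplies the odds that one night they all line up.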

As an example, Robinson returned to the resin plant explosion, which was caused by an exotherm that reached an ignition source. Several factors aligned in order to cause this incident:

  • A winter storm dropped a large amount of snow, preventing workers from reaching the plant.
  • A temperature control device had burned out, but no one knew it was not functioning.
  • The reactor was not equipped with a quenching system or a foam-over tank.

These “holes” in different systems and facility designs aligned one night to create a black swan event—in this case, an explosion.

In many cases, these events can be identified and mitigated through risk assessment, process hazard analysis, bow tie analysis, additional barriers, and a strong control strategy. So why aren’t these addressed more often? There are two typical reasons, said Robinson:

  1. The perception of risk is inaccurate.
  2. Cost, cost, cost—budget constraints are often an issue.

Always Be Learning

Shewitz said that despite the inherent unpredictability of black swan events, there are specific controls EHS professionals can implement to influence them.

First, learn from high-hazard industries and professions such as oil and gas, the space program, and the nuclear industry. Learn from both the regulatory requirements affecting these fields and the voluntary programs that have demonstrated success. Take a high-level approach when looking at these industries: What are their management systems, and how do they develop a strong safety culture?

From here, determine what makes sense for your own individual operations, said Shewitz. Build on your existing systems and customize what you’ve learned to your culture—go after the low-hanging fruit and then expand to the bigger issues you see.

Focus on follow-up from your existing information sources, including:

  • Maintenance records;
  • Inspection/audit programs; and
  • Accident/incident investigations.

Safety committees can (and should) help with the data analysis resulting from these sources, said Shewitz.

Then, you may want to expand your sources of information to include:

  • Suggestion programs;
  • Confidential or anonymous reporting systems; or
  • Near-miss reporting.

These are things that an individual EHS professional should be able to do mostly alone, with relatively few resources and little need for executive approval, said Shewitz.

Other Steps for Safety and Security Pros

Shewitz also presented a number of other steps that safety professionals can take in order to decrease the likelihood of a black swan event.

If you have dangerous equipment at your facility, implement preventive maintenance (PM) programs with a focus on:

  • Mechanical integrity;
  • Manufacturer’s recommendations; and
  • Codes and standards.

If your organization has the time to put these systems in, said Shewitz, then there’s time to keep them running right.

Also, it’s a good idea to have some sort of change control system that:

  • Addresses the management of change;
  • Is appropriate to the complexity of your systems; and
  • Is implemented as early as possible in the planning stages of an initiative. Shewitz noted that if you wait until equipment is ordered or installed, it becomes pretty hard to change things.

Next, know your industry’s best practices. Good sources of information include:

  • Trade associations;
  • Local safety councils, chambers of commerce, or ASSE chapters; and
  • The Chemical Safety Board, which is nonregulatory but has great material on these sorts of high-consequence events and preventive measures.

Finally, keep an eye on your organization’s safety culture. Shewitz noted that more than 60% of workers admit to having seen coworkers violate safety precautions or take unsafe shortcuts, and too many of them are unwilling or unable to speak up. To address this, your reporting culture must be based on trust, empowerment, and authority. Shewitz acknowledged that it’s very hard to get management approval to grant blanket stop-work authority to workers, but ultimate work authority (a clearly defined position that can make the call on stopping work onsite) may be easier to implement.

Dealing with Pushback

There may be pushback from management on some of these initiatives—remember, these events are “low probability,” and cost and budget are always on the minds of managers. Here are a few strategies Shewitz suggests for dealing with pushback:

  • Have facts at the ready (e.g., “See, this has happened before”) and use them to persuade the management team.
  • Have a risk assessment on hand.
  • Provide solutions that are as simple as possible.
  • Find added benefits to these solutions, such as how they can help the bottom line.
  • Line up support for your initiatives before you pitch them.

Black swan events are unpredictable, and the reality is that all safety systems are bound to have a few holes in them. However, with proper vigilance, an inquisitive outlook, and appropriate action, safety professionals can help prevent their organizations from falling victim to catastrophe.