How to spot major incidents and Black Swans in advance

Once detected, rare events are easy to see. The problem in health and safety management is to see them in advance.

Major incidents are as rare as Black Swans. You have not seen one on site for many years. But they are not extinct, not even endangered.

We spend a lot of health and safety management time and budget on managing minor incidents, or Grey Geese. But as all birds keep an eye out for eagles, so we should keep an eye out for Black Swans.

Common sense says a major incident, the kind that could close down our organisation, is impossible. Before 1700, all swans were thought to be white.

When the Dutch explorer Captain Willem de Vlamingh arrived off the coast of Terra Australis in 1697, there was still no record of a Black Swan. But de Vlamingh took a wrong turn on the western coast of Australia.

What he saw overturned thousands of years of belief: many hundreds of black swans. He noted that he could not find a single white swan. Australian Aborigines had a saying: "Rare as a white swan".

The main river in Perth, the capital of Western Australia, is the Swan River. The premium beer in this part of the world is Black Swan.

The analogy goes to the heart of our understanding of our environment, our method of thinking (inductive reasoning), our understanding of risk (or lack of it), our experience, and our behaviour. These cognitive issues apply directly to the safety equation.

I would like to write more on the philosophy of science, and how it feeds directly into our inductive thought processes around statistics, safety, and risk. Readers who want to follow that up could read Karl Popper and Paul Feyerabend: their papers on science versus common sense, and on testability, were published decades ago, yet they are as relevant today as ever.

A Black Swan Event (BSE) is largely unprecedented and unexpected. After an unusual event, common sense, voiced by the ordinary Joe, will say "it was bound to happen". Hindsight reveals what we did not want to see today.

Similar events have happened previously. We just haven't noticed, or have filed them under "Never happen again" or "Won't happen to us".

We all make a living from status quo thinking, until things change, and our thinking changes with it. Our job in health and safety management is to think like Galileo or Copernicus: against common sense, and trusting our instruments.

After Black Swans were discovered, it seemed obvious that they had existed all along; animals, after all, come in varying colours. Observations of other animals had already taught us about variability and mutation, but we trusted experience instead.

Experience is valid only if all factors (materials, processes, equipment, training, behaviour, hazards, risks) remain static, but they never do.

The theory of behaviour based on Black Swan Events was introduced by Nassim Taleb in his book The Black Swan: The Impact of the Highly Improbable. It is arguably one of the most important books published in the last 50 years, for making the science of Popper and others accessible to managers.


Taleb noted that a Black Swan Event had three properties:

Rarity: The Black Swan is a rare event; it lies outside the realm of common experience. A Black Swan is that million-to-one chance that statisticians said would never happen because it was a million-to-one chance. Yet million-to-one chances happen nine times out of ten (a short numerical sketch follows these three properties).

Extreme impact: It could be an explosion, or the hijacking of aircraft to fly into buildings. Or a somewhat middle-aged cyclist (myself) being struck by a car travelling at over 80 km/h. A single string of events can dominate all other factors.

Retrospective predictability: We were blind to the possibility, and in denial (there may be Black Swans, but not here). We believe that we can see things coming. We fall for the narrative fallacy: we construct a sensible story using only the pertinent information, discarding information that was not useful in the end. We manage as if we were doing an incident investigation, forgetting that an investigation is about only one possible event. Major events leave us even more vulnerable to risks, because we become fixated on the few things that did happen.
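On that million-to-one arithmetic: a rare event becomes near-certain once you count exposures across tasks, shifts, sites, and years. Here is a minimal sketch in Python; the per-exposure probability and the exposure counts are illustrative assumptions, not figures from this post:

```python
# Chance of at least one "million-to-one" event across n independent exposures:
# P(at least one) = 1 - (1 - p) ** n

p = 1e-6  # assumed per-exposure probability (per task, per shift, ...)

for n in (1_000, 100_000, 1_000_000, 5_000_000):
    p_any = 1 - (1 - p) ** n
    print(f"{n:>9,} exposures -> P(at least one) = {p_any:.3f}")
```

At a million exposures the chance of at least one event is already about 63 per cent; at five million it is over 99 per cent. The "impossible" event is simply one we have not yet been exposed to often enough.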

One of Taleb’s best stories about Black Swan Events is that of the turkey lovingly raised for Christmas dinner. Every day of feeding strengthens the turkey’s belief that the farmer is benevolent, and the turkey does not know how it will end.

Our natural psychological state prevents us from recognising the signs that a Black Swan is eyeing our site. We accumulate supporting information, such as reassuring incident rates, just as the turkey accumulates feedings. The moral of the story: do not be a turkey.
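The turkey’s trap can be written down. A minimal sketch, with the day count as an illustrative assumption: a naive frequency estimate built only from incident-free history concludes that catastrophe has probability exactly zero, whereas even a simple smoothing rule such as Laplace’s rule of succession keeps the unseen event in play.

```python
# The turkey's estimate of catastrophe risk after a long run of safe days.
safe_days = 1000  # assumed incident-free history (illustrative)
events = 0        # catastrophes observed so far

# Naive frequency estimate: the observed past rate taken as the future probability.
naive = events / safe_days                # 0.0 -> "it can never happen"

# Laplace's rule of succession: (events + 1) / (trials + 2).
# A mild smoothing rule that never assigns zero to the unseen event.
laplace = (events + 1) / (safe_days + 2)  # ~0.001 -> "it just hasn't happened yet"

print(f"naive estimate:   {naive:.4f}")
print(f"laplace estimate: {laplace:.4f}")
```

The point is not the particular smoothing rule; it is that an estimate built purely from experience says "never", and "never" is exactly what the turkey believes on Christmas Eve.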

Good and Bad Swans

Taleb also writes on “good” and “bad” rare events. A good Black Swan Event could be the personal computer (IBM had no advance idea of it).

Another might be the Internet: subject experts were still saying it would amount to nothing as recently as the 1980s. Good and bad is a matter of perception. But note that unexpected events occur all the time.

The First World War was a Black Swan Event; the two key personal protagonists were first cousins. The Second World War was another. Neville Chamberlain promised “peace in our time”, and less than a year later the world was at war, a war that ended with a new kind of bomb and the horrors of Hiroshima and Nagasaki.

Small incident, one fatality

We tend to think “big”, but Black Swan Events can be small. Safety officer Owen Scholes called me in tears: “Dave, we need you, we’ve just killed someone”. Those words are hard to forget.

As a trauma psychologist I immediately travelled to the scene, and later, at the depot, I was involved in psychologically treating his workmates. This disaster involved only one fatality.

The impact on the organisation was significant. The impact on Owen was beyond horrific. The victim’s wife was eight months pregnant when he died.

I measure disasters also by the qualitative impact on the individual and on the organisation. There is much more than insurance money at stake.

Where is the Black Swan? They are invisible until they appear in statistics, and by then it is too late.

Where is the risk?

One of the key questions about Black Swan thinking is whether the traditional management understanding of risk has any role to play.

My own view is that “risk” is an over-rated term that is actually very poorly understood.

How many major accidents (quantitative or qualitative) have occurred because no risk assessment had been completed? Very, very few. Incidents seem to bypass our metrics, our systems, our interventions, and our measures.

Some managers always say, and I am sick of hearing it, that ‘the risk assessment was obviously flawed’. Does this sound familiar? That is about as smart as a Christmas turkey.

Black Swan thinking says that what we don’t know is far more relevant than what we do know. We should be out looking for gaps in our data.

Incidents live at the limits of our understanding about likelihoods, probabilities, and risk.

An incident can also be a consequence of our unwillingness to consider input from other sources, leaving risks that we do not manage.

Risk is also a product of distortions of what we do know. Many of us would be surprised at how easily our “knowledge” can be distorted, both deliberately and subconsciously.

The biggest mistake we make in safety (and there are many big ones) is prediction based on what we think we know; the Heinrich/Bird triangle, which claims ‘predictability’ of incidents based on the scale of their consequences, is one of those.

We get this wrong far more often than we get it right. When something bad does not happen, we accept that as validation of our prediction. We seem to be hard-wired to be predictive and judgemental.
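One way to see the trap in triangle-style prediction is a small simulation. This is a sketch under a deliberately strong assumption, not a claim from this post’s data: minor incidents and major events are generated by independent processes, with made-up daily rates. Under that assumption, the two printed probabilities come out roughly equal, meaning a “good” minor-incident year tells you nothing about the major-event hazard.

```python
import random

random.seed(42)

# Illustrative assumption: minor incidents and major events arise from
# independent daily processes with made-up rates.
YEARS = 10_000
P_MINOR = 0.05   # assumed minor-incident probability per day
P_MAJOR = 1e-4   # assumed major-event probability per day

major_years = 0
quiet_years = 0
majors_in_quiet_years = 0
for _ in range(YEARS):
    minors = sum(random.random() < P_MINOR for _ in range(365))
    majors = sum(random.random() < P_MAJOR for _ in range(365))
    if majors:
        major_years += 1
    if minors < 15:  # a better-than-average year by the usual metric
        quiet_years += 1
        if majors:
            majors_in_quiet_years += 1

print(f"P(major event in any year):       {major_years / YEARS:.3f}")
print(f"P(major event | good minor-year): {majors_in_quiet_years / max(quiet_years, 1):.3f}")
```

To whatever extent the independence assumption holds on a real site, a falling minor-incident count is a poor lens for spotting the Black Swan.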

We do need a sense of predictability in our lives and organisations, and we get that sense from inductive reasoning. The problem is that we cannot live without inductive reasoning, nor can we live safely with it.

There are a number of traps in our reasoning. One trap is to shut off the nagging little voice of alarm. We live with many risks, and override most of them.

Black Swan Events advise us to invest less in our ability to predict. No matter how good we are at predicting (or think we are), major incidents will still happen, and they are not all Acts of God.

Remember that massive consequences do not necessarily mean large numbers of dead people. Consequences are relative: one injured person may carry massive consequences if that person is your relative.

We have to spend more time hunting for invisible Black Swans:

• Discard the obsession with risk assessment, statistics, and prediction.
• Examine the robustness of the knowledge we think we have.
• Find errors in our behaviour, and change that behaviour.
• Expand our managerial toolbox.

Keep healthy and safe, and think about what ‘risk management’ means.

  • David G Broadbent is a safety psychologist, and founder of TransformationalSafety.Com. This post is an edited extract from A Second on Safety.


3 thoughts on “How to spot major incidents and Black Swans in advance”

  1. We have been scripted to expend more time and effort on managing minor incidents, or Grey Geese. The analogy of the Black Swan challenges us to write a new script. For many years we have been pathologically obsessed with risk assessment, statistics, and predictions. It is time we heeded David’s advice and spent more time hunting for the invisible Black Swans.
    The Titanic disaster was a perfect example of an invisible Black Swan.

  2. Professor Nassim Nicholas Taleb, writer and former Wall Street trader, coined and popularised the definition of a “Black Swan” event. The UK exit from the EU was a “Black Swan” event. Safety professionals should learn from other disciplines in order to enhance or deepen their knowledge. Briefly put, cognitive flexibility is seriously required in our field! The editor of The Business Times, Ron Derby, wrote that Brexit was “The biggest iceberg we’ve hit this century” (Sunday Times, July 3 2016), linking Brexit to the Titanic disaster. According to the World Economic Forum (WEF), the following skills are needed to thrive in the fourth industrial revolution:
    - Complex problem-solving.
    - Critical thinking.
    - People management.
    - Co-ordination with others.
    - Judgement and decision-making.
    - Service orientation.
    - Negotiation.
    - Cognitive flexibility.
