How to spot risk tolerance at work

David Broadbent unpacks the large role of risk tolerance in workplace health and safety.

Major incidents are rare, but they migrate to sites where managers and workers have a high risk tolerance, writes safety psychologist David Broadbent.

Incidents sneak up on us, and afterwards we think, “We should have seen that coming”. Incidents hide in the fog of risk tolerance; only risk appreciation can spot them.

Behaviour is one of the most disquieting concepts in health and safety. It is just so natural to accept risk. In almost all things that we do there is some risk, and we learn to live with it. So how do we know how much risk is reasonably safe? We need to go cycling, watch the news, and visit a casino to find out.

I had never really given risk tolerance a great deal of thought until I was nearly killed in a cycling accident, loaded up with titanium, and screwed back together.

Some would say that cycling accidents are “rare”, given the number of kids riding to school and workers riding to work each day. That is one of the problems with applying a quasi-statistical measure.

Now look at the consequences, or potential severity. That could change depending on the route, frequency, road condition, etc. How do we manage risks with this much variety?

Now here’s the rub. Almost every road cyclist I know, myself included, performs high-risk manoeuvres a significant number of times on every ride. Why, when the possible outcomes are so dire?

Risk tolerance versus risk appreciation

The phrase “risk tolerance” did not come from safety at all, but from high finance. That should ring some alarm bells already.

Health and safety managers do not talk about tolerance much. Tolerance is about how much uncertainty we can handle.

Risk out of sight must not be out of mind. As safety professionals, we function in a community that measures its success very differently from the rest of the world. Our success is measured by the absence of events.

When something has NOT happened, then we are doing well. When so much “stuff” is not happening, we very quickly become tolerant of all those risks that we know are there, but are not happening.

After a while we may not even “know” they are there – we now just don’t notice. They have been absent from our experiential frame for so long that they are filed in the “rainy day” category.

Now here’s something a little scary: how do we know what is happening or not happening? Well, yes, there is that thing about our personal experience.

We are severely restricted in our ability to make informed decisions because of this question of knowledge.

I have lost count of the number of health professionals at John Hunter Hospital who, when advised what had happened to me, responded with “not another one”.

Does that place a whole new emphasis upon our risk matrix exercise earlier? It certainly did for me. We have to be careful, though; we don’t want to “jump” the other way.

The staff at John Hunter Hospital also have a biased frame of reference; they see the banged-up cyclists.

But “likelihood” has no relationship to “outcome”, and our ability to make appropriately informed decisions about “likelihood” is significantly compromised.

I refer to this as “inadvertent ignorance”; make no mistake, there is nothing deliberate about it. The consequences, though, I shall carry for all time.

Where does our knowledge about “likelihood” come from? I have already suggested a large measure comes from “personal experience”, but also from other sources.

Behavioural psychology literature teaches us of the impacts of immediate versus delayed rewards.

When we get home in one piece from work, we do NOT see the absence of a system failure as a reward. We do not give a scintilla of thought to the “what might have beens”.

Most incidents are not in the news

We also develop this “knowledge”, most unfortunately, from the media. That should scare us more than anything. Anybody who truly believes that the media provides unbiased reporting of news events has been living in a different world from mine.

Whether a “story” even makes the news is an issue. There are so many factors that impact this outcome as well.

Recently a young man of twenty-eight was killed when he was buried alive in East Maitland, Australia. He was plastering the sides of a four-metre hole which was to be used for horses to swim in. The hole collapsed and he died.

Of the four news networks that transmit into that area, only one even referenced this terrible fatality. If you had been watching the other three, you would have had no idea. No knowledge!

Without knowledge there can be no “understanding”. Without “understanding” there can certainly be no “decision making” – optimal or otherwise.

I often reflect on the past issues around Swine Flu. How many days did it take to stop being the lead news story? Do you even remember? What about the networks that showed images of people wearing face masks as a barrier to being infected by Swine Flu? We could argue the pros and cons of that approach, I am sure.

My point is that much of that footage was a couple of years old. It was actually part of their “Bird Flu” archive.

This is not a case of not reporting. It is clearly a case of fabricating aspects of the “story” to maximise the preferred impact of the piece. Our knowledge is clearly compromised.

Now a defence that some people put up is that, as we gain more experience and buy in technical competence, our ability to make more informed decisions around these factors increases.

That argument makes every bit of sense. Indeed, I travel the world based upon this quite reasonable belief. I am left pondering, though: what type of risk-based decision making occurred when the Egyptian government decided it no longer required a pork industry, and ordered the complete destruction of all the pigs in the land?

There was no evidence at all linking your common-or-garden everyday porker with the transmission of Swine Flu. Despite that, a meat supplier in Sydney reported during the same week that they would normally deliver upwards of 80 dressed porkers per day. Their order book shrank to four!

We also then heard about H1N1 Influenza. Why? Because US authorities expressed concern that continued reference to that form of influenza as “Swine Flu” would affect pork sales within the retail meat market. Well, we already knew that to be true.

So even the name of this thing was politically influenced. I make these points to demonstrate, I hope, that much of the “knowledge” we access as a key component of our risk decision-making processes is potentially so compromised as to make any decisions emanating from those processes quite problematic, and prone to failure.

Even the peak body, the World Health Organisation, has muddied the waters significantly and impacted our ability to make appropriate risk decisions about Swine Flu. A truly interesting read is the transcript of a 2009 WHO press conference. In short, it says that we had reached all the markers for novel H1N1 (Swine Flu) to be escalated to a Phase 6 pandemic (that’s as high as it goes). It then spends an eternity justifying why the alert had not been raised.

I spoke to a National Safety Manager last week who was at a loss to understand why a well-regarded employee was seen performing work at height, wearing all the appropriate fall protection devices.

He just had not bothered fixing them to anything. The more we do something, the more tolerant we become of the potential hazards associated with that “something”.

Do not make the mistake of believing that the more you do a thing, the better you are at doing a thing. If you are very fortunate that might be true. It might be equally likely you have just not been bitten yet.

We are often prone to overrate our competence, which also feeds into this whole question of risk tolerance. We can look more closely at the work of Professor Gerald Wilde if we wish to explore this “stuff” further.

Emergency and health workers see the results of inappropriate risk tolerance every day. Yet their data is not relevant to any particular workplace, and they are not in a better position to manage behaviour than workers are.

Practice does not make perfect

Each time you perform a task successfully, does that make you less likely to have a system event or failure? Most people intuitively consider that to be the case. Let me suggest to you, as loudly as I am able, that this is actually one of the biggest errors we make when trying to understand risk.

Each and every time you perform the same function, it is statistically independent of the previous attempt.

Suppose you are playing a game in which you throw a die, and to move forward you have to throw a six first. What are the odds? One in six.

Now assume that on your first throw you managed to come up with a four. You have to throw again; what are the odds this time? I shall leave that one up to you and come back to it later.

What is it about the world of risk and probability that people just do not seem to appreciate? By design, the poker machine keeps more than it hands back. That is a very simple concept to understand, or at least at a logical level we would have thought so.

The longer you sit at a poker machine feeding it, the more you are going to lose!
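The arithmetic behind that claim is just linearity of expectation. Here is a minimal sketch; the payout table below is entirely made up for illustration, not taken from any real machine:

```python
# Hypothetical payout table for a 1-unit bet: (probability, payout).
# Every other outcome pays nothing.
payouts = [(0.001, 200), (0.05, 5), (0.20, 1)]

# Expected return per spin, minus the 1-unit stake.
ev_per_spin = sum(p * pay for p, pay in payouts) - 1.0
print(f"expected result per spin: {ev_per_spin:+.2f} units")  # -0.35

# Expected loss scales linearly with the number of spins played.
for spins in (10, 100, 1000):
    print(f"{spins:>5} spins: expected loss {-ev_per_spin * spins:.0f} units")
```

As long as the expected result per spin is negative, no seating position, lucky shirt, or playing schedule changes its sign; only the stake and the number of spins scale the loss.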

Take a roulette wheel. We see people coming up with all sorts of strategies to fool themselves into thinking that the risk has been modified. For example, some people always sit in a certain spot relative to the wheel, some wear special items of clothing, some only drink certain concoctions whilst playing, and some only play during certain hours of the day.

I am sure that no matter what permutation you come up with, we would find someone who believes it is their special method for increasing their luck; or, to put it another way, reducing their risk of losing.

Now here is the scary bit. The majority of these people are quite intelligent functional human beings who are successful in many aspects of their lives.

What these examples demonstrate is how poor we are at understanding risk. Let me come back to the question of the die. Each and every time you throw it, the chance remains one in six. What your brain really tries to tell you, though, is “it’s about time” when, after eight throws, the six finally comes up.

How is that any different from sitting in front of the slot machine for two hours with thoughts such as “only a bit longer and I’ll hit the jackpot”?
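That “it’s about time” feeling is the gambler’s fallacy, and the independence of successive throws is easy to check numerically. A minimal simulation sketch (the seed and trial count are arbitrary choices):

```python
import random

random.seed(0)
TRIALS = 200_000

# Unconditional chance of a six on any throw.
uncond = sum(random.randint(1, 6) == 6 for _ in range(TRIALS)) / TRIALS

# Chance of a six on a throw that follows three non-sixes in a row.
hits = total = 0
streak = 0
for _ in range(TRIALS):
    roll = random.randint(1, 6)
    if streak >= 3:          # the previous three throws were all non-sixes
        total += 1
        hits += roll == 6
    streak = 0 if roll == 6 else streak + 1
cond = hits / total

# Both estimates sit near 1/6 (~0.167): the die has no memory.
print(round(uncond, 3), round(cond, 3))
```

A run of failures changes nothing about the next attempt; the streak exists only in our heads.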

For the most part we seem generally designed to minimise risk. We believe that the more we are exposed to something, the less likely it is to cause us any concern.

Risk management is a flawed art

I propose that the fundamental basis for approaching risk in our workplaces is significantly flawed. Almost all global jurisdictions promote the standard risk matrix of probability/likelihood versus injury/outcome.

At the beginning of this commentary I suggested that we do not do probability/likelihood very well. Indeed, we are generally pretty awful at it. If we accept that to be true, it puts the entire risk appraisal model at risk. That should be terrifying.
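That fragility is easy to demonstrate. Here is a minimal sketch of a standard matrix; the 1-5 scales and band thresholds are illustrative assumptions, not any jurisdiction’s official cut-offs:

```python
def risk_rating(likelihood: int, severity: int) -> str:
    """Score a hazard on a standard matrix: likelihood x severity,
    both on a 1-5 scale, banded into ratings (bands are illustrative)."""
    score = likelihood * severity
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# The same severity-4 hazard: rated with its true likelihood of 4
# ("likely") it comes out "extreme"; underestimate the likelihood
# as 1 ("rare") and it drops to "medium".
print(risk_rating(4, 4), risk_rating(1, 4))
```

Half of every cell in the matrix rests on the likelihood guess, so a compromised sense of “how often” silently downgrades hazards across the whole register.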

The Transformational Safety accident causation model speaks to the lateral movement of protective barriers, and also theoretically tells us that risk is actually a dynamic process.

What we need to be doing is equipping ourselves, and our people, with skills and strategies to understand the dynamic nature of risk for the sole purpose of significantly improving their personal risk competency.

To not do so places us all at the mercy of Risk Tolerance. Not something I wish to confront again. No more Titanium please.

  • David G Broadbent is a Safety Psychologist, and founder of TransformationalSafety.Com, based in Australia. He has several clients in South Africa and Asia.




2 thoughts on “How to spot risk tolerance at work”

  1. Dear David, thank you for this great article. I am certain everyone in the safety space would find it beneficial. For decades we have tried to predict and counter incidents at work by looking only at employees’ behaviour.
    However, a crucial thing that one could not measure in each individual until now is their Risk Propensity. By measuring one’s behaviour within the working environment and combining it with their Risk Propensity, we can almost completely eliminate accidents and injuries related to unsafe acts (violations), which amount to 88% of all incidents at work. The tools are available; employers should just take action!
    -Kind regards, Bernard

    ==== Editor notes: Your comment raises a question: once you have identified violations, or violators, or risk-prone workers, how do you manage them?
    Violations could never be eliminated; besides, violations are just symptoms of workplace culture.
    Some studies agree with your view that employee behaviour is a symptom of deeper causes, and that selection of ‘safe’ people, or discharge of ‘unsafe’ people, has minimal effect on safety culture.
    Nationality (what David labels geo-culture) does not contribute much to culture (based on my own anthropology research). There are safe and unsafe workplaces in every country.

    Most consultants do not measure individuals, but teams. Statistics are meaningless at the individual level, but predictive at team level.
    Another psychologist, a management consultant based in SA, told me that he identifies and fixes what he can, and never shows management members their errors. They would just rationalise that knowledge away by ego defence mechanisms (well studied by Freud and others). The strategic management consultant blames violations on system elements.
    Injured egos are hazardous. Sheq people are at risk of offending everyone. The phrase ‘I told you so’ erases any influence we may have had. -Edmond Furter

  2. Hi David, the importance of having the appropriate tools, skills, knowledge etc. to be able to spot risk cannot be denied. Science in safety, as in any other practice, is essential, BUT – never ignore your gut feel! Maybe it connects with good knowledge?

Comments are closed.
