I became more attuned to the failings in how we think about risk tolerance after I was squashed by a car.
The consequences for me were more than six months away from work, and more than a dozen screws and pieces of titanium holding my left arm and shoulder together, writes David Broadbent.
The post-traumatic stress disorder that came with that experience was an added bonus.
Mad keen cyclists like me place ourselves in the “line of fire” several times every time we go out on the road.
In countries that drive on the left-hand side of the road, every right turn places us at significant risk. In countries that drive on the right-hand side, the higher risk comes with every left turn.
We would never encourage our employees to place themselves in the “line of fire”, yet something does not ‘click’ when it is an avocational, recreational pursuit.
The same argument applies to a whole area of workplace–home hypocrisy.
For example, I recently facilitated the Monadelphous Safety Conference in Perth. Monadelphous is an engineering-based organisation with staff all over Australasia.
One of the questions I put to the audience was, essentially, “How often do you have your power tools ‘tagged’?”
Without exception, every site had its tools tagged at the appropriate intervals. The next question: “Why?”
The answer: “To ensure the power tools are safe.” Absolutely spot on. The next question, and in my view the most critical: “How many of you have your power tools tagged for home use?” No-one.
Ouch. This should be scary, and it points to a major behavioural hypocrisy. It would appear that the very tools our children might be using at home don’t need to be tagged.
The people we love the most, we actually place at greater risk. We commonly apply more of our “safety brain” at work, then turn it off when we are at home. This is just plain wrong.
Another critical factor here is the process by which we conduct risk assessments. A couple of issues are fundamental to whether these things work or not. Personality is one of them.
I have lost count of the number of times I have run “risk assessments” within a workshop environment. When a group conducts a risk assessment after its members have completed the same assessment individually, we get different results.
How intelligent is group intelligence?
Those different results often determine whether the task proceeds or not, so the discrepancy is a critical one.
The group’s risk rating is often a product of the person in the group with the most forceful personality, or the most seniority.
That has nothing to do with the actual risk, and this too should be scary. It is an aspect of group-think.
Yet another factor is our “body of knowledge”. We can only make decisions based on what we know.
It is very difficult to acknowledge what we “don’t know”. I don’t want to sound too much like a philosopher, but “we don’t know what we don’t know”, and that is exactly the point.
Real risk tolerance examples
Add to this the construct of “risk tolerance”: the more often we do something and nothing happens, the stronger the underlying belief that nothing will happen.
We thus develop the view that the “risk” has been overrated, and the point at which we recognise “risk” actually moves.
This change in belief has nothing to do with the risk itself. It is a set of false beliefs we build up over time.
In my world we call that the “normalisation of deviance”, and it is, in my view, one of the biggest challenges we have in safety.
One of the most powerful examples of this was the Longford gas explosion in Victoria, Australia. Alarms had become so common that the plant operated in “alarm mode” more often than not.
In addition, the auditory alarms associated with the actual system deviations had been switched off. Deaths resulted.
Consider the Texas City explosion. The level indicator in the splitter tower was known to be faulty, and this had become accepted as normal. Deaths resulted.
Shortcuts raise risks
Consider also the Waterfall train disaster (NSW, Australia). The driver had used a piece of timber to short-circuit the “deadman switch”.
He had a heart attack, the train accelerated, came off the tracks, and multiple deaths occurred. The investigation found that multiple drivers used the same technique.
When I told that story in South Africa, a senior person from the South African railways told me they knew their people did the same thing. Normalisation of deviance kills again.
We need to pay more attention to the fact that traditional approaches to risk assessment are simply not effective in certain areas of safety.
We need to accept that risk assessment cannot reliably predict what is going to go wrong, or when.
Nassim Taleb’s book, “The Black Swan: The Impact of the Highly Improbable”, is considered one of the most influential books of the past fifty-plus years. His thesis is exactly that.
While Taleb’s real claim to fame is in the financial world, his views resonate powerfully within our safety space.
We should leverage Taleb’s notion of the Black Swan to make us all so much safer.
Stay safe and well.
• David G Broadbent is a safety psychologist, and founder of TransformationalSafety.Com