Minor incident statistics or ‘near misses’ are like amber traffic lights; we speed up and then hesitate, writes safety psychologist David Broadbent.
Let us look at some of the stuff underneath hazard and near miss reporting. I was approached to write on the nature of safety reporting, at a time when I was reflecting on an apparently difficult period in the mining and minerals extraction industries.
In Australia particularly there seems to have been a procession of mining fatalities over the last twelve months or so. Even in the week prior to my putting fingers to the keyboard there have been two fatalities, numerous serious accidents, and a roof collapse in New Zealand.
I had also been spending a bit more time myself contemplating the true nature of ourselves, and some of the neuro-biology (brain function) underneath some pretty important aspects of decision making.
As an example, you are travelling home in your car, less than five minutes away. As you approach a very open, well-lit intersection, the traffic light turns from green to amber. What do you think first?
For almost ninety percent of people who contemplate this question, the first, instinctual answer is something like ‘could I make it?’ And the instinctual behaviour is to speed up.
Only then something else kicks in, based on erring on the side of caution, and the learned response is to slow down.
Nearly all of us accept significant risk for minor and instant gain, even against our training and social compact in safety systems. Traffic is not a unique example.
Risk behaviour neuro-biology
Now there are also some pretty forceful neuro-biological pre-determinants that contribute to this social hypocrisy, and almost make it inevitable.
Suffice it to say, maybe we are inherently hard-wired to seek the easy way of least resistance and most gain. We are hard-wired to take safety shortcuts.
There is a constant battle in us between two distinct components of brain function that determine our decision-making and behaviour. They are the Amygdala and the Frontal Lobe, or pre-frontal cortex.
It has a lot to do with the speed of information processing in the brain. Consider the Amygdala as functioning at the speed of a bright red Ferrari, while the Frontal Lobe has the pace of an ’83 white Volvo.
You can see which one is always going to win that race. Now here’s the kicker. The Amygdala is a part of our brain that handles emotional responses, while the Frontal Lobe is more responsible for the way we rationally process information, such as risk assessment and calculation.
In response to new information, we react based on emotion (Amygdala). If we are under pressure of any sort, such as an emotional desire to achieve goals, then we are emotional, which by some definitions is immature.
That in itself is dangerous and invites loss, injury, or toxic outcomes.
How do you tell the family that your employee died at work
Speaking of toxicity, fatalities in Australian mining in the last year or so may indicate rising loss, although we just don’t know: with such small numbers, a move from, say, four to six does not necessarily a trend or a crisis make.
Statistically, fatality and serious injury percentages may be insignificant, but each one is a profoundly significant event in the lives of the families and work colleagues involved.
It does not get any worse than having to walk up a driveway, knock on somebody’s front door, and tell them that their husband/wife, son/daughter, mother/father shall never take that walk again.
They have been killed in your employ. I know this from surveys conducted by TransformationalSafety.Com. When senior managers are given a range of scenarios and asked to rank them in order of which they pray they shall never have to confront, it is that long walk up the widow’s driveway that comes first. Ninety-seven percent of the 1,687 managers surveyed placed it at the top of their list of workplace nightmares.
Regretfully this universal, cross-cultural fear does not appear to translate into senior leadership behaviour in many environments.
Near misses, alarm bells, incident statistics
Consider the Upper Big Branch explosion in the United States, which killed 29 miners. There had been a number of issues reported to supervision which, it turns out, were ignored.
Local site leaders were required to report tonnages to the mine owners in Charleston every 30 minutes!
What message does that send to the operational workforce? Do you think workers would have any confidence that any safety hazards they reported would be acted upon?
What about the Pike River explosion in New Zealand, which killed 29 miners? Again we find that the conclusions of the Commission of Enquiry demonstrated that there were numerous warning signs of things not being right, and these were consistently ignored, despite being reported.
Ignored not only by local management, but by the regulatory authorities as well.
The Deepwater Horizon explosion in the Gulf of Mexico followed a discussion, described by witnesses as heated, between members of the drilling crew (Transocean), including driller Dewey Revette, and a senior client representative of BP.
The contractors were reporting safety concerns with the process. These guys had many years of offshore drilling experience between them.
The BP guy was heard to say, ‘Well, this is how it’s going to be’, and the drillers were seen to reluctantly agree. Those guys are dead.
Douglas Brown, Chief Mechanic on the Deepwater Horizon, said the senior official on the rig was heard to mumble, “Well, I guess that’s what we have those pinchers for”; which he took to be a reference to devices on the blowout preventer, the five-story piece of equipment that can slam a well shut in an emergency.
Guess what: the blowout preventer failed. It is always fatalistic to rely on the final barrier in a perceived serial sequence of accident or disaster prevention.
By the way, many disasters are a product of an apparently random confluence of contributory factors.
Seeing accident causation as a sequential, linear, domino process, is flawed.
This draws us back to the very real necessity of getting “inside” what is happening in a process, and trying to get ahead of the apparent entropy.
Incident statistics, accidents or disasters may appear to be random and outside managerial control, but they are not.
How often have you investigated an accident and heard a significant number of your people say words to the effect of, “I knew that was going to happen, just not sure when”?
Even major disasters, like small incident statistics, are manageable and largely preventable.
Workers are sensitive to risks and expectations
The people who know the most about what is going on, on the surface and beneath, are the operational workers.
The technical guys may know how something is designed, its parameters, etc, but workers who are exposed become sensitive to even the slightest deviations in plant, systems, and in management. They are good at finding out what management really wants.
When I was a metallurgist at BHP in the early 1980s, we had a special machine that you could place against a piece of steel to determine its composition.
This was used if there was any concern that the process may have been contaminated, and the wrong grade of steel could be sent to the customer.
It was a long and cumbersome process. We had guys on that plant who could perform the same function by sight: they would touch the steel with an angle grinder and read the colour of the sparks.
We must manage safety deviations
Where this leads us, is the necessity to significantly increase the reporting of potential hazards, near misses, deviations, etc.
Failure to do so shall result in unnecessary death, disaster, life-changing injuries and occupational diseases.
Workers are the key to gaining this level of information. We also know that the majority of operational workers keep this stuff to themselves. Why?
Supervisors and managers who ignore warnings do not intend to kill workers; they simply ascribe the risk to chance.
Workers are often conditioned to not report risks. In the language of psychology we might say that desire to report is extinguished by lack of recognition for reporting.
They face an iceberg of indifference to health and safety. We all know the damage that the invisible part of icebergs can do to massive investments.
One of the first steps an organisation must take to increase hazard and near-miss reporting, is to treat such reports with respect, and to respond.
They must keep the reporter informed. This is a simple and obvious process, yet the bulk of organisations cannot even get this right.
I have lost count of the number of organisations I have visited who were concerned that their Take-5, Step-Back, and similar programs were not working. In every case the underlying reason was the iceberg of inertia, and lack of feedback.
They were asking for information, but giving none or very little back.
It does not take long for people to realise when these interventions are merely administrative and political tools.
The failure of reporting does not lie at the feet of workers, but at the feet, and up to the armpits, of managers and organisational culture.
On sites that have better reporting, I have found the answer again glaringly obvious. Local management at these sites act as buffers against some of the less helpful messaging from above, and create an island of influence.
They implement very basic models of two-way communication, and make it clear that they embrace, encourage, and value the reporting of potential hazards and near misses. They are alert to amber lights.
They value reports and reporters, and immediately act. Stay safe and well.
• David G Broadbent is a safety psychologist and founder of TransformationalSafety.Com.