How to get minor incident statistics

Minor incident statistics, or ‘near misses’, are like amber traffic lights: we speed up and then hesitate, writes safety psychologist David Broadbent.

Let us look at some of what lies underneath hazard and near-miss reporting. I was approached to write on the nature of safety reporting at a time when I was reflecting on an apparently difficult period in the mining and minerals extraction industries.

In Australia particularly there seems to have been a procession of mining fatalities over the last twelve months or so. Even in the week before I put fingers to keyboard there have been two fatalities, numerous serious accidents, and a roof collapse in New Zealand.

I had also been spending a bit more time contemplating our true nature, and some of the neuro-biology (brain function) underlying some pretty important aspects of decision-making.

As an example, you are driving home in your car, less than five minutes away. As you approach a very open, well-lit intersection, the traffic light turns from green to amber. What do you think first?

For almost ninety percent of people who contemplate this question, the first, instinctual answer is something like ‘Could I make it?’, and the instinctual behaviour is to speed up.

Only then does something else kick in, erring on the side of caution, and the learned response is to slow down.

Nearly all of us accept significant risk for minor and instant gain, even against our training and social compact in safety systems. Traffic is not a unique example.

Risk behaviour neuro-biology

Now there are also some pretty forceful neuro-biological pre-determinants that contribute to this social hypocrisy, and almost make it inevitable.

Suffice it to say, maybe we are inherently hard-wired to seek the easy way of least resistance and most gain. We are hard-wired to take safety shortcuts.

There is a constant battle in us between two distinct components of brain function that determine our decision-making and behaviour. They are the Amygdala and the Frontal Lobe, or pre-frontal cortex.

It has a lot to do with the speed of information processing in the brain. Consider the Amygdala as functioning at the speed of a bright red Ferrari, while the Frontal Lobe has the pace of an ’83 white Volvo.

You can see which one is always going to win that race. Now here’s the kicker. The Amygdala is a part of our brain that handles emotional responses, while the Frontal Lobe is more responsible for the way we rationally process information, such as risk assessment and calculation.

In response to new information, we react first on emotion (Amygdala). If we are under pressure of any sort, such as an emotional desire to achieve goals, then we respond emotionally, which by some definitions is immature.

That in itself is dangerous and invites loss, injury, or toxic outcomes.

How do you tell the family that your employee died at work?

Speaking of toxicity, fatalities in Australian mining in the last year or so may indicate rising loss, although we just don’t know: a rise in such small numbers, from say four to six, does not necessarily a trend or a crisis make.

Fatality and serious injury percentages are statistically insignificant, but each is a profoundly significant event in the lives of the families and work colleagues involved.

It does not get any worse than having to walk up a driveway, knock on somebody’s front door, and tell them that their husband/wife, son/daughter, mother/father shall never take that walk again.

They have been killed in your employ. I know this from surveys by TransformationalSafety.Com: when senior managers are given a range of scenarios and asked to rank those they pray they shall never have to confront, it is that long walk up the widow’s driveway that heads the list.

Ninety-seven percent of the 1687 managers surveyed placed this at the top of their list of workplace nightmares.

Regretfully this universal, cross-cultural fear does not appear to translate into senior leadership behaviour in many environments.

Near misses, alarm bells, incident statistics

Consider the Upper Big Branch explosion in the United States, which killed 29 miners. There had been a number of issues reported to supervision which, it turns out, were ignored.

Local site leaders were required to report tonnages to the mine owners in Charleston every 30 minutes!

What message does that send to the operational workforce? Do you think workers would have any confidence with respect to any safety hazards they might report?

What about the Pike River explosion in New Zealand, which killed 29 miners? Again we find that the conclusions of the Commission of Enquiry demonstrated that there were numerous warning signs of things not being right, and these were consistently ignored, despite being reported.

Ignored not only by local management, but by the regulatory authorities as well.

The Deepwater Horizon Gulf of Mexico explosion followed a discussion, described by witnesses as heated, between some drilling contractors (Transocean) led by Dewey Revette, and a senior client representative of BP.

The contractors were reporting safety concerns with the process. These guys had many years of offshore drilling experience between them.

The BP guy was heard to say, ‘Well this is how it’s going to be’, and the Transocean guys were seen to reluctantly agree. Those guys are dead.

Douglas Brown, Chief Mechanic on the Deepwater Horizon, said the senior official on the rig was heard to mumble, “Well, I guess that’s what we have those pinchers for”; which he took to be a reference to devices on the blowout preventer, the five-story piece of equipment that can slam a well shut in an emergency.

Guess what: the blowout preventer failed. It is always fatalistic to rely on the final barrier in a perceived serial sequence of accident or disaster prevention.

By the way, many disasters are a product of an apparently random correlation of contributory factors.

Seeing accident causation as a sequential, linear, domino process, is flawed.

This draws us back to the very real question about the increased necessity to get “inside” what is happening in a process, and to try and get ahead of the apparent entropy.

Incident statistics, accidents or disasters may appear to be random and outside managerial control, but they are not.

How often have you investigated an accident and a significant number of your people say words to the effect of, “I knew that was going to happen, just not sure when”.

Even major disasters, like small incident statistics, are manageable and largely preventable.

Workers are sensitive to risks and expectations

The people who know the most about what is going on, on the surface and beneath, are the operational workers.

The technical guys may know how something is designed, its parameters, etc, but workers who are exposed become sensitive to even the slightest deviations in plant, systems, and in management. They are good at finding out what management really wants.

When I was a metallurgist at BHP in the early 1980s, we had a special machine that you could place against a piece of steel to determine its composition.

This was used if there was any concern that the process may have been contaminated, and the wrong grade of steel could be sent to the customer.

It was a long and cumbersome process. We had guys on that plant who could perform the same function by sight; they would touch the steel with an angle grinder and read the colour of the sparks.

We must manage safety deviations

Where this leads us is to the necessity to significantly increase the reporting of potential hazards, near misses, deviations, and the like.

Failure to do so shall result in unnecessary death, disaster, life-changing injuries and occupational diseases.

Workers are the key to gaining this level of information. We also know that the majority of operational workers keep this stuff to themselves. Why?

Supervisors and managers who ignore warnings do not intend to kill workers. They just ascribe the risk to chance.

Workers are often conditioned to not report risks. In the language of psychology we might say that desire to report is extinguished by lack of recognition for reporting.

They face an iceberg of indifference to health and safety. We all know the damage that the invisible part of icebergs can do to massive investments.

One of the first steps an organisation must take to increase hazard and near-miss reporting, is to treat such reports with respect, and to respond.

They must keep the reporter informed. This is a simple and obvious process, yet the bulk of organisations cannot even get this right.

I have lost count of the number of organisations I have visited who were concerned that their Take-5, Step-Back, etc programs were not working. In every case the underlying reason was the iceberg of inertia, and lack of feedback.

They were asking for information, but giving none or very little back.

It does not take long for people to realise when these interventions are merely administrative and political tools.

The failure of reporting does not lie at the feet of workers, but at the feet, and up to the armpits, of managers and organisational culture.

On sites that have better reporting, the answer, I found, is again glaringly obvious. Local management at these sites act as buffers against some of the less helpful messaging from above, and create an island of influence.

They implement the very basic models of didactic communication, and make it clear that they embrace, encourage, and value reporting of potential hazards and near misses. They are alert to amber lights.

They value reports and reporters, and immediately act. Stay safe and well.

• David G Broadbent is a safety psychologist and founder of TransformationalSafety.Com.



2 thoughts on “How to get minor incident statistics”

  1. Good article, sound logic (says my Amygdala); however, this is what my Frontal Lobe thinks… We hear a lot in safety management about the Heinrich triangle and the importance of near misses in preventing big incidents. But it is like saying the target for injuries should be zero; it is not practically implementable. (I think this site had an article on just that, some months ago.)

    Consider the Heinrich triangle: every 300 minor incidents/near misses result in 1 LTI. How does anyone have the time and resources to investigate 301 incidents, when we are not even good enough to prevent recurrence of the ‘ones’ so often? It is a fallacy and cannot be done: resources are just not there!

    Management often publicly say safety is no. 1, yet how many companies actually allocate resources to H&S? Most companies leave that to HR and do not even have a dedicated person at operational level, let alone at executive level! Point is: until companies put their money where their mouths are, this obsession with zero and near misses is but the stuff of textbooks.

    This comes from someone lucky enough to work for a company that actually has dedicated senior and operational H&S staff, but even for me, with those resources, the flood of incidents is too much.

    ==== Editor notes; UPDATE; The OHS Practitioner curriculum standard draft by the MQA as mining Seta, for the QCTO, for mining and industry, under skill 226302-001-KS-03:02; Techniques of accident and incident investigation (Intermediate) (Credits: 3), Topic Elements to be covered include: “1. Explain the difference between accidents and incidents; (NQF Level: 4) 2. Use examples to explain the interrelationship of accidents and incidents giving the typical statistical model of how incidents eventually lead to accidents; (NQF Level: 4).”
    Is this Heinrich all over again? I seem to recall that I informed the drafting forum that Heinrich is false, but they dragged him back in again. Perhaps from a university textbook? So I am trying again with this comment to the MQA: “The reference is probably to the Heinrich/Bird incident statistics triangle, which was found to be without scientific validity, and based on false data. Several researchers have pointed out that the frequencies of minor and major incidents are not linked. There is a generally apparent ratio, but it varies greatly between sites, companies, industries, and countries. The assumptions behind this element are wrong and outdated. The assumed distinction between ‘accidents’ and ‘incidents’ is also outdated, since the difference is only in severity, and is largely due to chance.” -END OF UPDATE.

    Heinrich’s statistical triangle, and Bird’s applications of it, were revealed to be mere assumption, as David Broadbent has also pointed out. Otherwise you raise a valid point. I agree that our conscious logic is often wrong.
    However some practitioners say they get data from up to 500 observations per month, and make good use of the information they derive from the data. Very few incident investigations, of course.
    If your HR people run Sheq, and your managers talk of Heinrich stats, be afraid.
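    [Editor's aside] The point in the update above, that minor and major incident frequencies are not linked by any fixed ratio, can be illustrated with a toy calculation. All site names and incident counts below are invented purely for illustration; this is a minimal sketch, not real data:

```python
# Toy illustration with INVENTED counts: the ratio of minor incidents
# (near misses) to lost-time injuries (LTIs) varies widely between sites,
# rather than following a fixed Heinrich-style 300:1 law.
sites = {
    "Site A": {"minor": 900, "lti": 3},
    "Site B": {"minor": 120, "lti": 4},
    "Site C": {"minor": 2000, "lti": 1},
}

# Compute each site's minor-to-LTI ratio.
ratios = {name: c["minor"] / c["lti"] for name, c in sites.items()}
for name, ratio in ratios.items():
    print(f"{name}: {ratio:.0f} minor incidents per LTI")
# Ratios of 300, 30 and 2000 under the same supposed 'law': the
# fixed-ratio assumption does not hold across sites.
```

    Under a strict Heinrich-style law all three sites would show the same ratio; the variation is the point.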

  2. Great article. I am currently working for a company whose executive members are very supportive of implementing health and safety awareness in the workplace. Last year LTIs were at 15%, but this year they are currently sitting at 3.5%. What I am trying to voice here is: once you get 100% support from the executive, nothing is impossible.

Comments are closed.