Everyone on site has a legal and moral responsibility for preventing accidents at work, but the dice are loaded against stopping production on gut feeling.
The dilemma of when and how to ‘stop the bus’ also applies to us as occupational health, safety, environmental and quality (Sheq) practitioners, writes Sheqafrica.com editor Edmond Furter.
Our reputation as health freaks, occasional alarmists and greenies could count against our influence in preventing accidents at work and at home. We therefore have to rely on our track record of having saved some skins and some budget, using our toolbox of risk profiles, incident statistics, case studies, probabilities, case law, ethics and cultural nudges.
We have to sell the unpopular, unpalatable, joyless, inhumane and contra-instinctual attitudes of prevention, caution, ‘what if’, the long way around, reading the manual first, ‘doing the math’, compliance.
Sometimes we have to remind some managers that we knew the contributing factors to an incident beforehand, which could be construed as a way of saying 'I told you so', or 'you did not care'.
Selling safety in parables
We often sell our terminology and world view to a world that would forever speak of accidents, human error, danger, and the domino effect. From the corporate world I have learned to use standard business terminology when speaking to the proverbial brick wall.
I will refer to accidents, human error, blame and guilt, if that would get the message through the corporate static.
We sell safety partly on our track record, quantified against proven quantities of budget, insurance, litigation, and reputation. This data need not come from our own site.
Do not change popular imagination, use it
Sometimes we sell safety based on minor incidents, which most managers, and some occupational health and safety professionals, believe to be statistically linked to major incidents. In some ways they are linked, and in some ways they are not, but how Heinrich faked the statistics for his triangle is an academic question, not a motivational question.
From domestic incidents I have learned that all people are reminded of warnings after minor incidents. My New Year’s resolution is to raise awareness of regular minor incidents, and their perceived links to rare but serious incidents.
Statistics (and insurance actuarial science) do not make that connection, but people do, and people change their behaviour based on risk perception.
Another branch or a competitor’s data could be more effective than our own, as parables in legend, spiritual books, literature and movies prove.
Stopping production usually requires one incontestable fact, law, standard, deduction, or status (rank), and we, or the worker who spots the risk, often do not have instant recourse to the switch, so it takes a large dollop of attitude and culture to raise the point that could save our colleagues’ skins.
Of course we are legally and morally obliged to first personally duck the risk before raising the alarm. There is no standard for the length of the string of evidence, deduction or persuasion required before we 'stop the bus'.
Better prevent than investigate incidents
I replayed some old episodes of an aviation safety investigation TV series, named Black Box, during the holiday. Forensics is fascinating, as Sir Arthur Conan Doyle discovered when he re-invented detective novels or ‘whodunits’.
They are fascinating for the feeling of superiority that comes with hindsight. We are programmed to believe that we could have predicted what happened, if only we had been there to warn everyone.
In fact, we usually are present and able to warn everyone, but we are equally programmed to follow peer pressure and culture, which tends to be risk tolerant. Injuries from hunting down wounded buffalo, and war casualties, are among the irrefutable testimonies to our nature.
Some risk perception traps
From aviation I have learned that our death traps often lie in highly variable local factors, such as weather, metal fatigue, and management and worker behaviour.
From the power generation and waste management industries I have learned that our natural tendency is to consider various deviations in isolation. A water leak and a perpetual coal stockpile fire, or waste fire, are seldom linked until a geological fault is revealed to stoke the fire with hydrogen and oxygen.
We tend to box plant design, re-design, maintenance, operating procedure, training, material, process, shift handover, and many other functions, without tracing even some of the obvious implications that each have on the others.
The ‘not my job’ syndrome and ‘not my fault if it fails’ syndrome are among the killers that live in our blind spots.
From the training and defence industries I have learned that we tend to trust lecturers and avoid the hassle of doing the reading and math to apply their material to our jobs, sites, processes and conditions.
The tickbox and safety file syndrome is another of those defence mechanisms, truly proactive, but only in the sense of covering our own butts in advance.
An inspection or safety file that potentially serves only our ‘I told you so’ syndrome, is as useless as Sherlock Holmes after the incident.
• Edmond Furter is the editor of Sheqafrica.com
• To join our Sheq contributors, Christelle Fouche, David Broadbent, Mabila Mathebula, Gerrit Augustyn, Ria Swanepoel, Celeste Erasmus, Shane Lishman, Rudy Maritz, Brett Solomon, Doug Michell, Ben Fouche, and posts from organisations such as Saiosh, Nioccsa, Achasm, Buildsafe SA, Safebuild /Master Builders Associations, Department of Labour, Compensation Commissioner, and Department of Environmental Affairs, send your Sheq research results, practical advice, blog, or comment to email@example.com
• Readers may also comment on any article or blog in the Comment column.