Don't oversimplify accidents, by Roger Bibbings, RoSPA

February 2010

If you really want to prevent accidents you need to investigate and understand their causes as fully as possible, says UK Royal Society for the Prevention of Accidents (RoSPA) Occupational Safety Adviser, Roger Bibbings.

There are two very common errors in the way people perceive accidents. At one extreme, they see them as rare, chance chains of events and circumstances that are beyond both prediction and control; at the other end of the spectrum, they see them as quite simple events with single causes.

In reality nearly all accidents - even mundane ones - turn out to be as complex as they are preventable. But failure to understand the complexity of accidents leads both to missed opportunities for prevention and to over-simple, inappropriate prescriptions to improve safety and reliability.

One of the reasons for an over-simple, uni-causal view of accidents among many non-experts is that few lay people have ever had to undertake an accident investigation. Investigation is not only challenging in itself but can be a great learning experience for those involved. Sadly, however, it tends to remain the preserve of 'experts' - even though team-based investigation, led by managers with worker involvement, offers many more opportunities for combining skills and insights as well as securing collective buy-in to findings and recommendations. (See www.rospa.com/occupational-safety/advice/safety-failure)

Over the last nine years, the need to enhance the quality of learning from accidents has been one of RoSPA's key occupational safety and health issues. Truth be told, however, it has been one of the hardest of all our policy priorities on which to make progress. Most businesses seem to think they investigate accidents quite adequately. But, although there are many organisations that do have really effective approaches to investigation and organisational learning from accidents, sadly the reality is that the vast majority of accidents are not investigated at all - especially in small and medium enterprises (SMEs). Often this failure is accompanied by little or no reporting of near misses. And, where accidents are investigated, there are usually major barriers to real learning.

The aftermath of an accident is hardly a comfort zone for anyone. People feel bruised, angry and vulnerable, especially where accidents are followed by a long enforcement and compensation trail that can often take years to settle. And, because enforcers and claimants are each pursuing their respective agendas, the focus of their investigations is invariably on attributing fault and blame, which in turn creates an adversarial atmosphere that poisons relationships and prevents openness and learning. And even where the organisation does try to learn lessons, there are usually some fairly fundamental errors in the approach adopted.

Firstly, because serious events warranting extensive investigation are few and far between, the habits of investigation are often lacking: there may be no clear procedures, a lack of clarity about purposes, little or no managerial or worker involvement, and no way of scaling investigation effort or agreeing terms of reference with the commissioners of the investigation - usually senior managers. And at the end of the process there is often little or no communication of lessons learned (for advice on operational readiness to investigate, visit www.nri.eu.com).

Secondly, investigation can be distorted and stunted by the (often unconscious) operation of 'stop rules' (stopping the investigation as soon as an apparent cause is found) and biases (seeking only that evidence that confirms a pre-existing theory about what happened and why).

Thirdly, investigators often do not go back far enough in their evidence search to establish in detail not just what happened in the last five seconds before the accident, but what was happening in the last five hours, five days, five weeks or five months beforehand.

Fourthly, there is often little use of structured methods to integrate evidence and no examination of underlying job, technological or organisational factors. This invariably leads to blaming the victim.

And underlying all this there tends to be a weak understanding of human error and the various ways in which it manifests itself in accidents.

Blame

Despite the focus on the complexity of accidents in the current teaching of H&S professionals, there are still far too many practitioners who seem to be easily drawn to simple explanations and quick fixes. After all, it is often much easier in an investigation to point the finger at poorly performing individuals as the prime authors of their own or others' misfortunes than to seek out other, potentially more significant, causal factors. Over-simple explanations in turn stimulate a ready appetite for trying to improve safety performance by harsh discipline, retraining, repetitive behavioural safety (BS) programmes or even the use of screening techniques to weed out allegedly less reliable individuals.

Respected figures in the health and safety field such as Richard Booth and Trevor Kletz have observed wryly that to say over 95 per cent of accidents are caused by human error is about as useful as saying that 100 per cent of falls are caused by gravity! Of course any serious student of accident causation has to acknowledge that errors by individuals do indeed form part of the explanation of most accidents. After all, major disasters such as Piper Alpha, Bhopal and Chernobyl were all initiated by the erroneous actions of teams and/or individuals. Yet subsequent investigations showed that the most significant causes were firmly rooted in weaknesses in technical and management systems.

In the case of Chernobyl, any suggestion that person factors, and not the Soviet nuclear system, were the root cause of the accident would have been seen as utterly one-sided, if not ridiculous. On the other hand, Chernobyl in particular led to an acceleration of interest worldwide in human factors in safety and a new search for ways to ensure the reliability of the man/machine interface.

Over the last three decades, particularly as a result of the work of people such as James Reason, we have made big steps forward in our understanding of human factors and their relationship to other kinds of pathogenic weakness in organisations. Human error itself is understood as a complex phenomenon, comprising: 'slips' and 'lapses' (which are skill-based); 'mistakes' (which can be rule-based or knowledge-based); and 'violations'. The latter can be 'exceptional', 'routine' or 'situational' - as explained in the UK Health and Safety Executive (HSE) guidance on human factors - HSG48.

In turn these forms of error can combine in various ways (e.g. Steve Stradling's dictum 'Error plus violation = crash!') together with job and organisational factors. Other 'person' factors in accidents can include not just things such as personal attitudes to safety but impairments (poor general health, poor sleep quality, poor eyesight, stress, drugs, alcohol etc) and distraction factors (poor communications, monotony, information or task overload, interruptions etc).

Despite all this complexity, the assumption that most accidents are simply the result of carelessness and/or wilful rule-breaking just will not fade away. Currently there seems to be a worrying resurgence of interest in the idea of 'accident-proneness'. The suggestion here is that some people are more likely to have accidents as a result of absent-mindedness, clumsiness, carelessness, impulsivity or a predisposition to risk-taking. Although substantial research has been devoted to the subject, many studies have cast doubt on whether 'accident-proneness' actually exists as a distinct, persistent and independently verifiable syndrome. It was largely discounted, for example, by the Robens Committee in 1972 as a result of the evidence submitted by Andrew Hale and others.

If, empirically, some people in certain settings do seem to have more accidents than others, this is certainly not explained by personality traits alone. Personality variables are no doubt important, but individual 'injury proneness' is just as likely to be associated with a variety of non-personality factors such as inexperience, lack of knowledge and skill, younger age, poor sleep, poor physical fitness or poor eyesight. Emotional factors can also contribute to accident involvement: for example, depression or anxiety, which can result in loss of concentration.

The idea of the careless or accident-prone worker also lies behind the attraction which many safety practitioners feel towards behavioural safety (BS) programmes. Such programmes clearly have their place, but not as stand-alone interventions and only once proper health and safety management systems are well established. Typically BS programmes tend to target violations. Yet in the training of BS observers, trainers often neglect to explain that violation is only one error type and that violations can in turn interact with many other factors. So in the context of an investigation, just concentrating on rule-breaking without looking at everything else is not really the best way forward (see HSG48). Often BS programmes are introduced on the assumption that all the options for improving safety by improving technology or organisation have been exhausted. And yet time and again investigations show that there are still many things which could have been done to make work equipment or management systems safer. It was back in the 1980s that HSE research showed that about 75 per cent of accidents were due to failure by management to put reasonably practicable measures in place and under a quarter were due to failure by employees to follow procedures.

Investment

Accident causation is multi-factorial, and safety solutions must therefore be multi-stranded. So there is always a danger - particularly if investigators do not fully understand this - that they will tend too easily towards reductionism, ascribing disproportionate significance to those causal explanations which support the case for safety prescriptions that lie within their particular intellectual comfort zone.

Learning from accidents is not easy. Accidents test organisations and they test relationships. Above all they should test our assumptions and prejudices.

A final point to remember is that accidents should always be regarded as an investment. You've spent time, trouble and money having them. So logically you should try to maximise your return by learning lessons that can enhance safety and prevent recurrence.

Acknowledgements to Roger Bibbings, RoSPA for allowing this article to be published here. It first appeared in RoSPA's Occupational Safety and Health Journal, December 2009. www.rospa.com