Swiss-cheese model

Reason's accident causation model

The "Swiss-cheese model" is a widely adopted model in human factors for explaining accident causation. The model has taken on different representations over the years. Images 1 & 2 seem to be the representation most often seen in healthcare. "The Swiss cheese model hypothesises that in any system there are many levels of defence, such as checking of drugs before administration, a preoperative checklist or marking a surgical site before an operation. Each of these levels of defence has little 'holes' (known as 'latent conditions') in it which are caused by poor design, senior management decision-making, procedures, lack of training, limited resources, etc. If latent conditions become aligned over successive levels of defence they create a window of opportunity for an incident to occur. They also increase the likelihood of making 'active errors'. When a combination of latent conditions and active errors breaches all levels of defences, a patient safety incident occurs." (paraphrased from Carthey & Clarke, 2010, p.61).

This current version of the model differs from the first version in several respects (see an adapted version of that first version as image 3):

  • Firstly, it has adopted a more "cheesy" look, making it easy to remember and convenient as a heuristic. Yet the "cheese" slices have lost any description, leaving the model rather simplistic and of little practical relevance since, in principle, anything can be a layer of defence.
  • Secondly, all layers in the model have become defences. In the first model, defences referred to physical, procedural, administrative or other barriers set up to capture active errors. Although the idea of all layers being defences is not in itself inadequate, it diminishes the importance of those last defences-in-depth that can be designed between an active error and its consequences.
  • Thirdly, the current version has also lost the hierarchical depiction of a system, replacing it with a more horizontal one. This horizontal depiction describes a process or a department better, but downplays the role of the system's hierarchy. While the first version explained well how an operator may inherit errors made at higher echelons in the hierarchy (culture, decision-makers, managers, etc), the current version unwittingly places responsibility for safety back on single hierarchical layers (typically, the operators'), wiping out the contribution that a systems approach has made over the last decades.
[Images 1 & 2: slices of Swiss cheese with their holes not aligned and aligned, respectively]
(Images 1 & 2 embedded from Duke University Medical Center on 31 October 2011)
The basic structural elements identified in the first version of Reason's (1990) model are the following:
  • Decision makers. These include high-level managers, who set goals and manage strategy to maximize system performance (eg, productivity and safety).
  • Line management. These include departmental managers, who implement the decision makers' goals and strategies within their areas of operation (eg, training, sales, etc).
  • Preconditions. These refer to qualities possessed by people, machines and the environment at the operational level (eg, a motivated workforce, reliable equipment, organizational culture, environmental conditions, etc).
  • Productive activities. These refer to actual performance at operational levels.
  • Defences. These refer to safeguards and other protections that deal with foreseeable negative outcomes, for example by preventing such outcomes, protecting the workforce and machines, etc.

Accidents occur because weaknesses or 'windows of opportunity' exist, or else open up, at all levels in the system, allowing a 'chain of events' to start at the upper echelons of the structure and move down, ultimately resulting in an accident if it is not stopped before it occurs. Put differently, most (if not all) accidents can be traced back to weaknesses at all levels in the system, including the decision-makers' level (Perezgonzalez et al, 2011).

[Image 3: adapted version of the first version of Reason's model, showing its hierarchical layers]
(Image 3 embedded from CrewResourceManagement.net on 31 October 2011)
References
1. CARTHEY Jane & Julia CLARKE (2010). Implementing human factors in healthcare. 'How to' guide. Patient Safety First (UK), 2010.
2. PEREZGONZALEZ Jose D, ZiZhang NG & Abdul M YOOSUF (2011). Accident causation model. Journal of Knowledge Advancement & Integration (ISSN 1177-4576), 2011, pages 25-27. Also retrievable from Wiki of Science.
3. REASON James T (1990). Human error. Cambridge University Press (New York, USA), 1990. ISBN 9780521314190.

Want to know more?

  • Iceberg heuristic
  • Human Factors Analysis & Classification System

Author

Jose D PEREZGONZALEZ (2011). Massey University, New Zealand (JDPerezgonzalez).


Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License