Towards the Identification of Prototypical Risk Situations in Anaesthesia as a Complex System

 

A.S. Nyssen, Work Psychology Department, University of Liege, Belgium

 

Keywords: Human error, Systemic Analysis, Prototypical Risk Situations

 

Abstract

 

Accident investigations in medicine, as in other complex domains, tend to attribute most adverse outcomes to human error. Human error is defined in terms of an act deviating from normal and accepted safe practice, which presupposes stable working conditions. This is not the case in many clinical situations. In order to better understand the root causes of accidents in clinical systems, we analysed 30 problem situations in anaesthesia. We complemented classical accident analyses, based on the external contributory factors, with a cognitive analysis, based on the decision functions involved in the problem situation. The results revealed that the critical phase in the decision-making process is far from always being the diagnosis. Most often cited was failure to perceive information during surgery. The analysis of our data also revealed different cognitive difficulties at different degrees of expertise. This analytical perspective is valuable if we want to predict cognitive failures connected with particular work conditions (prototypical risk situations) and limit the risk through means such as training and technical or organizational improvements.

 

Introduction

 

Human error has been proposed as the most frequent cause of critical incidents in medicine, as in other complex domains (refs. 1-3). Accident investigations often classify accidents into exclusive categories: human error, technical factor or other complication. Human error is defined by the criterion of "performance which deviates from the norm". This definition implies a well-structured task sequence performed in stable and predictable work conditions (ref. 4). This is not the case in many medical situations, where uncertainty is high. In anaesthesia, for instance, it seems that at least 20% of anaesthesia cases involve a problem event requiring intervention by the anaesthetist, while around 5% of cases involve a potentially catastrophic event (refs. 5, 6). Moreover, the patient's reactions are not always predictable and his clinical state evolves constantly. As a result, medical procedures are necessarily less specific than in industrial environments, and the decision-making process should be analyzed in relation to its dynamic context. Under such conditions, the usual convention of defining human error in relation to the norm is difficult to apply. Furthermore, the classification of medical accidents into exclusive categories appears irrelevant in a clinical system in which humans, technology and organization are more and more interdependent. Understanding how this complex system places demands on the physicians, and how the physicians meet those demands, seems more important for the prevention and management of accidents than statistical and epidemiological data. This implies exploring the knowledge, attentional and contextual demands that operate in the field and influence the behaviours and the decision-making processes of the practitioners. Seen from this perspective, accident data provide valuable information about how the system in which practitioners are embedded functions and malfunctions.

 

In general, the accident reporting forms used in studies of risk in medicine include: patient data, type of surgery, a detailed report of the incident and an assessment of the associated factors (refs. 7, 8). Although this information on what we term "extrinsic" factors is the most objective way of collecting information about critical events, it may not be sufficient for proposing an accident-reduction strategy, considering the evolution of the physician's task towards greater cognitive demands. This evolution is due to the introduction of computerized technology in the hospital: for instance, in today's operating room, the anaesthetist supervises, detects and regulates the case essentially from electronic displays. By providing more precise information on the patient, this technology facilitates the diagnosis and control of the patient, but at the expense of a higher mental cost. This observation stresses the importance of analyzing critical events at the cognitive level, in order to discover the difficulties encountered by the physicians in this man-machine system and to adapt training programs and technology accordingly.

 

Several researchers have emphasized the need for accident analyses to go beyond the superficial level and the particulars of an accident (refs. 9, 10). Finding points of commonality among accidents requires a mechanism for abstracting patterns from accident data. Hollnagel (ref. 11) stimulated this search by making a distinction between the error phenotype, or domain description, and the error genotype, or underlying cognitive mechanisms. Reason's latent failure model (ref. 12) has suggested that a variety of common latent factors present in most organizations can potentially degrade safety.

 

In this paper, we carry this construct further and explore the idea of prototypical risk situations that link the human performance issues we have observed with attributes of modern man-machine systems. We used the domain of anaesthesia as a laboratory for collecting, analyzing and learning about prototypical risk situations. We will attempt to demonstrate how the concept of prototypical risk situation provides a mechanism for abstracting common patterns of breakdowns in complex systems. Although anaesthesia is usually described as a procedure for putting the patient into an unconscious state so that surgery is possible, the action of the anaesthetist is highly contextual: it lies within an evolving situation marked by the transformation of the patient's state and by the surgeon's actions. Thus, it shares the common profile of complex systems as described by Woods (ref. 13): high dynamism and uncertainty, time pressure, ill-formed problems, complex human-machine interactions and risk.

 

Materials and Methods

 

The collection of accident cases was carried out at the University Hospital of Liège (Belgium). Between January 1995 and April 1996, 30 problem situations were collected by a senior anaesthetist. The sample also comprises cases reported by anaesthetists in training in 16 peripheral institutions outside the University Hospital. Each of these cases satisfied four criteria: 1) the incident constituted a potential threat to the safety of the patient at one or several moments during its course; 2) the actions of one or several anaesthetists played a critical role in the evolution or the consequences of the incident; 3) the anaesthetist who experienced the situation was willing to present it to his peers; 4) an extensive reporting form was available for analysis. This reporting form was filled out as soon as possible after the incident by the anaesthetist involved. It included: patient data with an assessment of risk (ASA Physical Status), type of surgery, the detailed anaesthesia procedure from premedication to recovery and even postoperatively, a detailed report of the incident and an assessment of the associated factors. In addition, the anaesthetist was asked to evaluate the decision functions that could have played a role in the genesis of the problem situation. Labels for the cognitive activities involved in the decision-making process were created using Rasmussen's well-known model of decision making (ref. 4). In order to better fit the sequential structure of an anaesthesia case, three main phases of evaluation were differentiated: preoperative anaesthetic risk evaluation, perioperative risk management and postoperative risk evaluation. Table 1 shows the framework, based on Rasmussen's model, used in our reporting form.

Table 1 - Framework for Potential Cognitive Failures

Inappropriate preoperative anaesthetic risk evaluation
Failure to perceive information
Misdiagnosis
Inadequate perioperative risk management
Failure to follow established procedure
Inappropriate postoperative risk evaluation
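
As an illustration of how such a reporting form and framework might be encoded for computer-assisted collection and analysis, the sketch below captures the categories of Table 1 and the main fields of the form in Python. It is only a sketch under our own naming assumptions: the CognitiveFailure and ReportingForm identifiers and the example values are hypothetical, not the actual form used in the study.

    from dataclasses import dataclass, field
    from enum import Enum

    class CognitiveFailure(Enum):
        """Potential cognitive failures (Table 1), after Rasmussen's model."""
        PREOP_RISK_EVALUATION = "inappropriate preoperative anaesthetic risk evaluation"
        PERCEPTION = "failure to perceive information"
        MISDIAGNOSIS = "misdiagnosis"
        PERIOP_MANAGEMENT = "inadequate perioperative risk management"
        PROCEDURE = "failure to follow established procedure"
        POSTOP_RISK_EVALUATION = "inappropriate postoperative risk evaluation"

    @dataclass
    class ReportingForm:
        """One problem situation, as reported by the anaesthetist involved."""
        asa_status: int                      # ASA Physical Status (patient risk)
        surgery_type: str
        anaesthesia_procedure: str           # from premedication to recovery
        incident_report: str                 # detailed narrative of the incident
        associated_factors: list = field(default_factory=list)
        critical_functions: list = field(default_factory=list)

    # Hypothetical example: a case with two critical decision functions
    case = ReportingForm(
        asa_status=2,
        surgery_type="abdominal",
        anaesthesia_procedure="standard general anaesthesia",
        incident_report="bleeding detected late during surgery",
        associated_factors=["poor communication", "fatigue"],
        critical_functions=[CognitiveFailure.PERCEPTION,
                            CognitiveFailure.PERIOP_MANAGEMENT],
    )

Encoding the form this way makes the later counts over decision functions and associated factors (Tables 2 and 3) a matter of simple aggregation over cases.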

Monthly "safety" conferences were organized in order to allow the anesthetist who experienced the situation to present the facts and to discuss them in the presence of the members of the Anesthesia-Resuscitation Department. These conferences played a determining role in the appreciation of the adequacy of reaction to the situation through the flow of communication between expert anesthetists and those in training.

 

Results

 

Tables 2 and 3 summarize the factors associated with the 30 problem situations collected and the critical decision-making steps involved in them.

Table 2 - Factors Associated with Problem Situations

Associated factor                        n
inadequate experience                   30
poor communication                      26
lack of sleep, fatigue                  24
inadequate supervision of juniors       16
productive pressure                     16
unfamiliar with the operating room      13
anaesthetist/surgeon relations          14
presence of a distracter                10
inadequate monitoring                    9
visual restriction                       9
interruption                             8
emergency                                1

 

Table 3 - Distribution of the critical decision functions across the problem situations, according to the degree of training of the anaesthetist involved. For one problem situation there can be several critical stages; thus the number of responses is not equal to the number of problem situations analyzed.

              2nd   3rd   4th   5th   senior   mixed
pre. eva.      2     3     1     2      1        2
detection      3     4     3     -      -        3
misdiag.       2     4     1     -      1        -
peri. man.     3     3     1     -      -        2
procedure      4     4     1     1      -        -
post. eva.     -     2     1     1      -        1
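
To make the counts in Table 3 easier to reuse, the short sketch below encodes them as a matrix and verifies the note above, namely that the number of responses exceeds the number of problem situations because one situation can involve several critical stages. It is a sketch in Python/NumPy; reading the empty cells of Table 3 as zero counts is our assumption.

    import numpy as np

    # Rows: critical decision functions; columns: degree of training (Table 3).
    # Blank cells of the table are read as zero counts (our assumption).
    functions = ["pre. eva.", "detection", "misdiag.",
                 "peri. man.", "procedure", "post. eva."]
    training = ["2nd", "3rd", "4th", "5th", "senior", "mixed"]
    counts = np.array([
        [2, 3, 1, 2, 1, 2],   # pre. eva.
        [3, 4, 3, 0, 0, 3],   # detection
        [2, 4, 1, 0, 1, 0],   # misdiag.
        [3, 3, 1, 0, 0, 2],   # peri. man.
        [4, 4, 1, 1, 0, 0],   # procedure
        [0, 2, 1, 1, 0, 1],   # post. eva.
    ])

    print(dict(zip(functions, counts.sum(axis=1).tolist())))  # per decision function
    print(dict(zip(training, counts.sum(axis=0).tolist())))   # per degree of training
    print("total responses:", int(counts.sum()), "for 30 problem situations")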

 

From a systemic analysis of the different contributory factors identified in our study, four prototypical risk situations emerge. They bring into play the type of error, the complexity of the system, the contextual factors (including the role of the team), the cognitive processes involved in decision making, and the degree of training.

 

The first prototypical risk situation concerns failures to perceive information during surgery and failures of procedure. It mainly concerns anaesthetists in the 2nd, 3rd and 4th years of training. Failures to perceive information relate to situations where the anaesthetist either did not detect valuable information about the problem or detected it too late (for instance, bleeding, ischemia or airway obstruction). This type of failure is the most frequently cited origin of problem situations with a severe outcome for the patient. Failures of procedure or execution appear in unfamiliar environments. Lack of experience, poor communication, fatigue and inappropriate supervision are often mentioned by the anaesthetists as contributory factors.

 

The second prototypical risk situation concerns difficult diagnosis and treatment. The results show that it was mainly anaesthetists in the 3rd year of training who experienced diagnostic difficulties. In-depth analysis of these problem situations showed that the anaesthetist does not ask for help, as the procedure suggests. Fatigue, poor communication, the presence of a distracter and inappropriate supervision are the most frequently associated factors mentioned by the anaesthetists.

 

The third prototypical risk situation mainly concerns novice anaesthetists in the 2nd and 3rd years of training and involves inappropriate preoperative and postoperative anaesthetic evaluation. It is the dynamics of the world, and the adaptation to it, that hold our attention here. Fatigue, poor communication and, above all, lack of experience are the most frequently cited associated factors.

 

The last prototypical risk situation involves more experienced anaesthetists and also concerns inappropriate preoperative and postoperative anaesthetic evaluations. The most frequently cited factors associated with these problems are fatigue and poor communication.

 

We attempted to validate these patterns by means of LISREL (estimation of linear structural equation systems by maximum likelihood methods), but the limited size of our database and the binary form of the variables did not allow us to develop the relational structures mathematically.

However, a correspondence analysis allowed us to describe the proximities between type of failure and degree of expertise, taking into account the differences in numbers (Figure 1). Given the limited size of our data set, it has mainly descriptive value. The proximity of the three points for the 2nd, 3rd and 4th years of training signifies a similarity in their profiles of failure types. Looking at the distances between these three points, we see that the types of failure that most concerned this sub-category of anaesthetists are failures of preoperative risk evaluation, detection, diagnosis (particularly close to the 3rd-year subjects), procedure and execution, while the 5th-year anaesthetists and the seniors are more concerned by preoperative and postoperative risk evaluation failures.

Figure 1 - Correspondence analysis: proximity between type of failure and degree of expertise
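
For readers who wish to reproduce this kind of map, the sketch below shows one standard way of computing a simple correspondence analysis from the contingency table of Table 3, via the singular value decomposition of the matrix of standardized residuals. It is a sketch in Python/NumPy under our assumptions (empty cells read as zero counts); the original analysis may well have been run with a dedicated statistical package.

    import numpy as np

    # Contingency table from Table 3 (rows: decision functions, columns: training).
    rows = ["pre. eva.", "detection", "misdiag.", "peri. man.", "procedure", "post. eva."]
    cols = ["2nd", "3rd", "4th", "5th", "senior", "mixed"]
    N = np.array([[2, 3, 1, 2, 1, 2],
                  [3, 4, 3, 0, 0, 3],
                  [2, 4, 1, 0, 1, 0],
                  [3, 3, 1, 0, 0, 2],
                  [4, 4, 1, 1, 0, 0],
                  [0, 2, 1, 1, 0, 1]], dtype=float)

    P = N / N.sum()                    # correspondence matrix
    r = P.sum(axis=1)                  # row masses
    c = P.sum(axis=0)                  # column masses
    # Standardized residuals: D_r^(-1/2) (P - r c^T) D_c^(-1/2)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)

    # Principal coordinates; plotting the first two dimensions of rows and
    # columns together gives a map like Figure 1.
    row_coords = (U / np.sqrt(r)[:, None]) * sv
    col_coords = (Vt.T / np.sqrt(c)[:, None]) * sv
    for name, (x, y) in zip(rows + cols,
                            np.vstack([row_coords, col_coords])[:, :2]):
        print(f"{name:10s} dim1={x:+.2f} dim2={y:+.2f}")

With counts this small, the relative positions of the points, rather than their exact coordinates, are what carry meaning, as in the figure.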

 

Conclusions

 

It is not our concern in this paper to discuss the results with regard to the anaesthesia accident literature (see ref. 14). We will focus on the systemic analytical perspective adopted in the study and discuss its value as a way of escaping from overly superficial accident analyses in medicine that emphasize the role played by a single factor, human error. In this discussion, we will also attempt to demonstrate that the prototypical risk situations that emerge from our accident data are not specific to a single domain, but rather present characteristics that are shared by complex dynamic systems. Seen in this light, we believe that they can be cross-contextual.

 

We collected and analyzed 30 problem situations, describing both the contextual factors and the critical decision functions that played a role in the genesis of the problem situations.

 

We observed that the main critical decision function mentioned by the practitioners is failure to perceive valuable information during surgery. The first configuration shows a link between this kind of difficulty and the fact that the process changes not only as a direct consequence of the problem solver's actions but also spontaneously, due to system factors such as other agents' actions on the process (for instance, the surgeon's actions in our case). We can relate this configuration to the problems of situation awareness that have been described as the origin of many accidents in other dynamic environments. In aeronautics, Sarter and Woods (ref. 15) have observed the difficulties experienced by pilots in constructing an up-to-date representation of the state of the plane, notably concerning the status of the on-board automated systems. The information given out by these automated systems is often insufficient to allow the pilot to form a precise representation of their functioning (ref. 16). In the same way, the surgical draping limits the anaesthetist's access to visual information about the surgical procedure. If access to visual information is limited and if verbal exchange within the team is reduced, as we observed, the practitioner might not be informed of problems until critical signs appear on the displays. This can delay the diagnosis process and treatment. Collective work has become indispensable in modern working environments due to the development and complexity of techniques. It is the visibility of the behavior of all the agents (machine and/or human) involved in a task that is our concern here.

 

The second prototypical situation concerns problems of planning and anticipation. More and more studies describe planning and anticipation as determining elements in the recovery from problems in dynamic environments (refs. 17, 18). From the cognitive point of view, we can consider that orienting attention before the appearance of a problem contributes to partially reducing the worker's uncertainty regarding the place, the moment and the type of response he should make during the problem situation (ref. 19). It is easy to imagine the advantage of this benefit in dynamic systems, where time delays are crucial in avoiding permanent damage. In the domain of anaesthesia, this observation has led several countries to put forward the preanaesthesia visit as an obligatory safety standard (ref. 20). But making this visit mandatory is not a guarantee of safety. In most hospitals, preanaesthetic visits are often done late in the evening before the operation, under time pressure, and even by the team on duty. In these conditions, the quantity of resources assigned to this visit is perhaps not optimal. A bias such as reticent rationality, described by Reason (op. cit.) as the refusal of human cognition to engage in fastidious analytical reasoning under high workload, may shape the quality of the decision-making process (see the fourth prototypical risk situation, involving more experienced practitioners, in our study). Multitasking is a common attribute of modern systems, and humans can easily be overwhelmed by work demands in such conditions.

 

Finally, the analysis of our data makes it possible to link the type of cognitive difficulty encountered by the practitioner with his degree of expertise. This can improve our knowledge about the progressive development of the skills and abilities required by the task in its context. We observed that it was mainly anaesthetists in the 3rd year of training who experienced diagnostic difficulties. At this intermediate period of learning, other studies have shown (refs. 21, 22) that behaviour is guided in a "descending" manner by more rigid structures of knowledge or rules, ignoring possible exceptions. This may lead to the development of "fixation errors", which have been widely recognized as a major source of human error in dynamic and complex environments (ref. 23).

 

To summarize, Table 4 presents the prototypical risk situations that emerge from the flow of accident data. These patterns emphasize the links between human performance, contextual factors, certain attributes of complex work systems and the structuring of expertise. They emphasize the cognitive compromises firmly established in a complex context and the persistence of errors even in more experienced practitioners.

Table 4 - Characteristics of the four prototypical risk situations

Critical decision functions          Attributes of the system    Underlying cognitive mechanisms   Degree of training
failure of detection & peri. man.    collective work, dynamic    limited attention capacity        trainees
misdiagnosis                         uncertainty                 heuristics, fixation              3rd year
failure of reevaluation              dynamic, feedback           anticipation, coping with time    novices
bad evaluation                       multitasking                reticent rationality              more experienced

 

In conclusion, this study emphasizes the need to adopt a systemic approach to analyzing accidents in complex systems such as clinical systems. To do so, we complemented classical accident analyses, based on the external contributory factors, with a cognitive analysis, based on the decision functions involved in the problem situation. This type of research is critical if we are to better understand the demands and the constraints posed by these systems on the practitioners and to predict system breakdowns in modern human-machine working environments. The results should lead to the development of better training programs and technologies that actually improve safety. We believe that the concept of prototypical risk situation provides a mechanism for abstracting common, generic patterns of system breakdowns. We propose this study as an encouragement to further investigations, in order to test the hypothesis of prototypical risk situations on a larger sample of accident data and in other complex domains.

 

References

1. Amalberti, R. Safety in flight operations. In B. Wilpert & T. Qvale (Eds), Reliability and safety in hazardous work systems: approaches to analysis and design. Hillsdale: Erlbaum, 1993.

2. Chopra, V., Bovill, J.G., Spierdijk, J., & Koornneef, F. Reported significant observations during anaesthesia: a prospective analysis over an 18-month period. British Journal of Anaesthesia, 1992; 68: 13-17.

3. Hollnagel, E. Models of cognition: procedural prototypes and contextual control. Le Travail Humain, 1993; 56: 27-57.

4. Rasmussen, J. Analysis of the tasks, activities and work in the field and in laboratories. Le Travail Humain, 1993; 56: 133-147.

5. Gaba, D., De Anda, A. The response of anesthesia trainees to simulated critical incidents. Anesthesia and Analgesia, 1989; 68: 444-451.

6. Sanborn, K.V., Castro, J., Kuroda, M., Thys, D.M. Detection of intraoperative incidents by electronic scanning of computerized anesthesia records. Anesthesiology, 1996; 85: 977-987.

7. Cooper, J.B., Newbower, R.S., Long, C.D. Preventable anesthesia mishaps: a study of human factors. Anesthesiology, 1978; 49: 399-406.

8. Vourc'h, G. Enquête épidémiologique sur les anesthésies. Ann. Fr. Anesth. Réanim., 1983; 2: 333-385.

9. Woods, D.D., Johannesen, L.J., Cook, R.I., Sarter, N.B. Behind human error: cognitive systems, computers and hindsight. State-of-the-art report for CSERIAC, Dayton, OH, 1994.

10. Reason, J. The identification of latent organizational failures in complex systems. In J.A. Wise, V.D. Hopkin & P. Stager (Eds), Verification and Validation of Complex Systems: Human Factors Issues. Berlin: Springer-Verlag, 1993.

11. Hollnagel, E. The phenotype of erroneous actions. In G.R. Weir & J.L. Alty (Eds), Human-Computer Interaction and Complex Systems. London: Academic Press, 1991.

12. Reason, J. Human Error. Cambridge: Cambridge University Press, 1990.

13. Woods, D.D. Coping with complexity: the psychology of human behavior in complex systems. In L.P. Goodstein, H.B. Andersen & S.E. Olsen (Eds), Tasks, Errors and Mental Models. New York: Taylor & Francis, 1988.

14. Nyssen, A.S. Vers une nouvelle approche de l'erreur humaine dans les systèmes complexes : exploration des mécanismes de production de l'erreur en anesthésie. Thèse de doctorat (unpublished), Université de Liège, Liège, Belgium, 1997.

15. Sarter, N.B., Woods, D.D. Strong, silent, and out-of-the-loop: properties of advanced (cockpit) automation and their impact on human-automation interaction. CSEL Report 95-TR-01. Columbus: Ohio State University, 1995.

16. Norman, D.A. The "problem" with automation: inappropriate feedback and interaction, not "over-automation". Philosophical Transactions of the Royal Society of London, 1990; B327: 585-593.

17. Xiao, Y., Milgram, P., Doyle, D. Off-loading, prevention, and preparation: planning behaviors in complex system management. In Proceedings of the 25th Annual Conference of the Human Factors Association of Canada. Mississauga, Ontario: Human Factors Association of Canada, 1992.

18. Hoc, J.M. Effets de l'expertise des opérateurs et de la complexité de la situation dans la conduite d'un processus continu à long délais de réponse : le haut fourneau. Le Travail Humain, 1991; 54: 225-249.

19. Camus, J.F. La psychologie cognitive de l'attention. Paris: Armand Colin, 1996.

20. Belgian Anaesthesia Patient Safety Steering Committee. Belgian standard for patient safety in anaesthesia. Acta Anaesth. Belg., 1989; 40: 231-238.

21. Lesgold, A. Acquiring expertise. In J. Anderson & S.M. Kosslyn (Eds), Tutorials in Learning and Memory. New York: Freeman, 1984; 31-58.

22. Anderson, J.R. Acquisition of cognitive skill. Psychological Review, 1982; 89: 369-406.

23. De Keyser, V., Woods, D.D. Fixation errors: failure to revise situation assessment in dynamic and risky systems. In A.G. Colombo & A. Saiz de Bustamante (Eds), Systems Reliability Assessment. Brussels: ECSC, EEC, EAEC, 1990; 231-251.