This is a draft. Comments and criticisms very welcome.

Reasons For The Failure of CRM Training in Aviation

Chris Johnson

Department of Computing Science
University of Glasgow,

Glasgow, G12 8QQ, Scotland.
+44 141 330 6053




Crew Resource Management (CRM) techniques have been proposed as means of reducing the opportunity for human error in aviation. They are also intended to improve the detection and mitigation of those errors that do occur. There are, however, a number of limitations with existing approaches. For instance, the FAA has recognized the need to integrate CRM more closely with Standard Operating Procedures (SOPs). Recent incident reports have also argued that CRM problems continue to be major sources of aviation accidents. This paper, therefore, builds on the work of the FAA to identify further reasons for the apparent failure of many CRM training techniques.


Crew/cockpit resource management; training; safety.



Crew Resource Management (CRM) has been used to describe many different forms of training [1]. Initially, the focus was on the individual’s interaction with their colleagues in the cockpit. Training for crew members included material on protocols and procedures that were intended to reduce ambiguity in cockpit communications [2]. Subsequently, CRM techniques began to focus on the wider issues associated with team building and with the effective sharing of tasks during "high-workload" situations [3]. The terminology began to reflect this distinction as "crew resource management" gradually replaced "cockpit resource management". These approaches were finally integrated with technical training through FAA initiatives, such as the Advanced Qualification Programme. A number of recommended practices were introduced to encourage mutual situation awareness, team-based decision making and workload management.

Unfortunately, the successful introduction of CRM training into many airlines has not been mirrored by any noticeable reduction in the number of incidents and accidents that stem from crew coordination and communication problems [4]. Two possible explanations have been offered for this apparent failure. The first, proposed by Foushee and Helmreich as early as 1988, is that CRM provides greatest support for crew performance under normal operating conditions [5]. Paradoxically, however, coordination is not necessary for most of the time that crews are in flight; it is critical only in those rare emergency situations that impose extremes of high workload [3]. Most CRM programs have addressed these concerns by using high-fidelity simulators that enable crews to test the utility of their training in a direct manner. There have also been attempts to introduce crew selection procedures that are intended to identify individuals who have positive attitudes towards group activities.

The second approach that has been developed to enhance CRM rejects much of the previous argument. Rather than focussing on CRM as a means of improving performance under abnormal conditions, others have sought to ensure that it becomes a necessary component of many routine tasks. This approach has resulted in the FAA’s recent Advanced Crew Resource Management (ACRM) Manual:

"Once ACRM is implemented, the crews are provided with focused opportunities to practice CRM procedures under normal, non-normal and training conditions. Crews, through the normal CRM procedures, are provided with the opportunity to practice specific CRM behaviours every time they fly. This frequent practice of learned behaviours promotes the development of CRM skills, skills that an airline has identified as essential to good performance within its operational environment"

Of course, these two approaches are not contradictory. ACRM also emphasizes the use of simulator training to reinforce CRM skills under abnormal circumstances. The difference lies in the emphasis that Seamster and others have placed upon the use of CRM techniques in nominal operating conditions [1].

It is too early to judge whether or not the FAA’s ACRM techniques will have a significant impact upon crew performance in both nominal and extreme operating conditions. However, the basic components of CRM training are not radically different from those that were first proposed in the 1980s. For example, the same simulation techniques have been widely used to support CRM for more than a decade. The perceived success of such training tools has reached the stage where they are now a prerequisite for public transport operators to be granted their UK Aircraft Operators Certificate. UK Aeronautical Information Circular 143/1993 states that all crew must have completed an approved CRM course before January 1995. Subjects to be incorporated in such a training course include: Standard Operating Procedures, the Flight Deck Social Structure and a detailed examination of the manner in which CRM can be employed in order to make a positive contribution to flight deck operations. JAR OPS sub-part N, 1.945(a)(10) and 1.955(b)(6) and 1.965(e) extended similar requirements to all signatory states during 1998. It, therefore, seems appropriate to conduct an initial investigation into the effectiveness of these techniques as a means of reducing aviation incidents and accidents.

Such an analysis introduces a number of methodological concerns. CRM failures are much more visible than the successful instances of flight deck coordination that help to avoid an adverse outcome. However, by expanding the scope of our analysis to include FAA/NASA’s Aviation Safety Reporting System (ASRS) and the UK Confidential Human Factors Incident Reporting Programme (CHIRP), it is possible to glimpse situations in which CRM training helps crews to detect and mitigate opportunities for failure. It can also be argued that by studying those accidents that do occur, it is possible to further enhance CRM training techniques so that they address the real failures of existing approaches rather than those that are identified through anecdotal evidence.



There are numerous examples of incidents and accidents that occur when CRM breaks down. The ASRS DirectLine bulletins provide one of the best sources of information on CRM failure. These bulletins analyze several different incident reports that address common failures:

"…the following report excerpts indicate a lack of CRM, and a resultant failure to maintain an adequate division of labor among the cockpit crew. In the first report, numerous distractions inside and outside the cockpit, combined with an apparently uncompleted checklist, led to a relatively minor altitude deviation:

"Reported weather was thunderstorms and hail. We were on a heading and altitude...that kept us parallel to a line of thunderstorms. After level-off at FL290, [the Center Controller] called us 500 feet high. In all the confusion...we neglected to reset altimeters at FL180. The problem arose...during a high workload period of time, a period of moderate turbulence, lightning nearby, working with airborne radar to determine our safest flight path, and communicating constantly with the Controller." (#107888)

The next reporter likewise experienced high workload and multiple distractions, including a minor mechanical malfunction.

"Descending through approximately 23,000 feet and while navigating an area of precipitation and thunderstorms, both air conditioning packs failed. we worked on the pressurization problem...we were assigned 11,000 feet. As we leveled, ATC asked our altitude because he saw us at approximately 10,500 feet. Then we noticed that two of our altimeters were still set at 29.92 with the [actual] pressure at 29.42. Our workload was obviously heavy, but we should not have missed this basic procedure." (#265215)

Again, appropriate division of cockpit tasks (one pilot to fly the aircraft, the other to handle the malfunction), and adherence to procedure (the checklist) probably would have caught this mistake before ATC did. At the very worst, left unnoticed, this incident had the makings of a repeat of other distraction-related accidents."

(ASRS DirectLine, Issue No. 9, March 1997).

It is worth noting that, given current certification requirements, the crews involved in these incidents should have completed some form of CRM training. The previous citation illustrates how inadequate workload management and poor situation awareness, as a result of multiple distractions, can contribute to adverse incidents. There are other ways in which poor CRM jeopardizes safety. In particular, there is growing concern about the impact that poor CRM has upon other personnel in the aviation industry. In the following example, poor CRM had direct implications for ground staff as well as the flight crew:

"Some reporters continued with an operation even when something didn't look right, or was blatantly wrong. Flight crews also admitted to failing to request a tug to get into, or out of, a tight parking place. The latter two problems may have been responses to schedule pressure or to demand for on-time performance, also mentioned by many flight crew members as an underlying cause of incidents. These and other sources of distraction also caused a marked reduction of cockpit coordination and CRM skills. A plane's rear airstairs received damage when the crew became distracted by multiple demands, and failed to act as a team:

"[This incident was caused by] distractions in the cockpit, plus a desire to operate on schedule. There were several conversations going on from inside and outside the aircraft. Raising the airstairs is a checklist item...backup is another checklist item which requires the Second Officer to check a warning light. No one noticed the light. The pushback crew consisted of 2 wing observers plus the individual in the tug...all failed to observe the rear stairs." (#264692)

(ASRS DirectLine, Issue No. 8, June 1996).

Such incidents are common in the publications and in the data sets that are provided by incident reporting systems such as the ASRS and CHIRP. They are also well documented in the human factors literature [3]. Together these reports serve to emphasize the contribution that CRM training makes to safe operation. However, there is also a sense in which the relatively high frequency of incidents caused by poor CRM also indicates the failure of existing CRM training techniques.



It is surprising how few positive instances of CRM training are cited in the publications of national and international incident reporting schemes. As part of this work, the author was only able to find two explicit examples out of more than four hundred incidents in the sixty most recent editions of the ASRS’ Callback and DirectLine. This raises significant concerns. It is important to reinforce the positive effects of appropriate behavior, not just the negative consequences of violation [6]. The following citation provides an example of these beneficial effects of CRM.

"Great CRM and Piloting

…The Captain’s autopilot dropped off with several warning flags on his flight instruments. He transferred control of the aircraft to me. During descent, various warning lights illuminated, which were reset several times. We ended up with one pitch trim working. The Captain was surrounded by inop flags on his instrument panel, so was unsure of which instruments were still operating. Random electrical warnings erroneously indicated that the aircraft was simultaneously on the ground and in the air… The Captain and I had donned oxygen masks as soon as we detected smoke. The Captain had a partial com. failure with his oxygen mask, then with his headset/boom mike. Cabin pressurization was climbing. Cabin pressurization control was switched to standby mode. The SO found a second fire extinguisher and discharged it into the continuing red glow in the circuit breaker panel. During the approach, we encountered... failure of both direct lift control auto spoilers. At touchdown, spoilers were manually extended. I selected reverse thrust, but no thrust reversers worked. On taxi in, all three engines were in flight idle. At the gate...the aircraft was still pressurized–Flight Attendants could not open the door. The SO tried to shut down all packs and engine bleeds, but could not. The Captain attempted to shut down the engines with fuel and ignition switches, but engines kept running. Engine fire [fuel shutoff] handles were pulled, and engines shut down. The door was opened from the outside, and the passengers exited.

The final diagnosis from maintenance personnel: an improperly installed wiring clamp had worn through the insulation and shorted out. Kudos to the flight crew for great crew coordination and superb handling of this aircraft emergency."

(ASRS Callback, Issue No. 239, May 1999)

As mentioned, this example is unusual because it is explicitly indicated as a positive instance of "great CRM and piloting". This is an interesting distinction. It perhaps explains the FAA’s interest in developing ACRM techniques that view CRM as an integral part of piloting skills. In general, however, the lack of positive examples of CRM in ASRS and CHIRP publications forces analysts to look at source data sets or to make inferences about the role of CRM in those incidents that are described in DirectLine and Callback. In both cases, there is often insufficient evidence for us to make detailed claims about the benefits of CRM. For instance, without access to the reporting crews we cannot determine whether individuals had recently undergone CRM training. It is only possible to associate general patterns of behavior with positive outcomes. This is illustrated by the following report:

"This was my leg and we were departing Runway 22L with the SID. I briefed the departure—an immediate left [turn] to 190 degrees with a right [turn] at 2.3 DME to 220 degrees, climb to 2,500 feet (ATC restriction). Flying the departure, ATC issued a left [turn] to 230 degrees as we crossed the 2.3 DME fix, climb to 6,000 feet and a frequency change. The next controller, who was very busy, issued a "tight turn to 040 degrees," which I mistakenly assumed to be a left turn. Starting the turn, ATC commented, "need a nice tight turn…" which the Captain responded to affirmatively. Then ATC came back, "just wanted to confirm a right turn." We complied immediately. Looking back, I should have requested clarification on direction of turn. ATC never issues a "tight turn," always a direction of turn—"right turn" is what he must have said. In this situation, I knew there was a parallel departure off Runway 22R, and at the time the 040 degrees turn heading was issued, left was the closest direction. Also, I was too eager to comply instantly in a very busy environment with rapidly issued clearances. Next time I will...verify any ATC clearance that seems vague or non-standard, especially one as critical as direction of turn that close to the airport."

(ASRS Callback, No 242 August 1999)

At first sight, this incident illustrates how poor CRM creates the circumstances for a potential accident. Neither the first officer nor the Captain sought to clarify the "tight turn" request from the Air Traffic Controller. Common agreement between both members of the crew should have reduced the opportunity for such errors. As the contributor notes, "ATC never issues a 'tight turn,' always a direction of turn—'right turn' is what he must have said". However, it can also be argued that the previous incident provides a successful example of the wider application of workload distribution in CRM. Even though the ATC officer was working under pressure, they still managed to detect the potential mistake and encourage sufficient review for the crew to avert a potential accident.

The lack of positive information about CRM in incident reporting schemes is also apparent in CHIRP. No articles explicitly illustrating the positive effects of CRM were found in the last three years of Feedback. This might be due to the fact that existing reporting forms do not explicitly prompt for the information about the positive role of previous training [7]. The key point here is that outside of simulator studies, we actually know very little about the detailed beneficial effects of CRM training [8].


The previous paragraphs have argued that most studies into CRM have focussed on finding the situations and circumstances in which its failure contributes to an incident. In contrast, this paper has argued that it can be difficult to find direct operational support for the benefits of CRM training as a means of mitigating incidents and accidents. There is also a growing body of evidence that illustrates the consequences that arise when CRM training techniques fail to deliver their intended benefits. For instance, the following citation is taken from the NTSB report into the in-flight fire and emergency landing of Federal Express flight 1406. All of the crew had successfully graduated from approved CRM courses. The Captain had received recurrent CRM and simulator training approximately six months before this accident. The flight engineer had received CRM training five months before the accident:

"After the accident, the captain said that he had allowed the first officer to continue flying the airplane during the emergency so that he could coordinate with ATC and work with the flight engineer on completing the checklists. This should have resulted in an effective apportionment of the workload among the three crewmembers, in that the flying pilot would not have been overly distracted from flying the airplane, the flight engineer would have received needed assistance with his duties, and the captain would have had the opportunity to oversee the actions of both. However, the Safety Board is concerned that, despite the captain’s stated intention to serve in a monitoring and coordinating role, he failed to provide sufficient oversight and assistance to ensure completion of all necessary tasks.

The captain did not call for any checklists to address the smoke emergency, which was contrary to FedEx procedures. (The flight engineer initiated the "Fire & Smoke" and "Cabin Cargo Smoke Light Illuminated" checklists.) Nor did he explicitly assign specific duties to each of the crewmembers. The captain also did not recognize the flight engineer’s failure to accomplish required checklist items, provide the flight engineer with effective assistance, or intervene to adjust or prioritize his workload. In fact, the captain repeatedly interrupted the flight engineer during his attempts to complete the "Fire & Smoke" checklist, thereby distracting him further from those duties."

(NTSB/AAR-98/03, page 68)

The Federal Express accident is only one of several that have been caused or exacerbated by CRM problems even though all of the personnel involved had recently graduated from approved CRM training courses. The recent UK AAIB report into the Puerto Plata crash provides an extreme example of this:

"Throughout the approaches at Puerto Plata the commander deviated persistently from the SOPs and company regulations often without any prior consultation with the FO. Moreover, there were instances where the operation of the aircraft was unusual or where individual parameters far exceeded those experienced in normal operations. However, the FO confirmed that he only questioned the commander's actions or intentions on one occasion when he asked which runway he was positioning for as he entered the visual approach. These high workload situations are exactly the areas where sudden incapacitation of the flying pilot would endanger the safety of the aircraft and it is therefore imperative that whenever any such deviations are observed the non flying pilot must draw attention to them and satisfy himself that the other pilot is not suffering from a subtle, incipient or sudden incapacity. The FO made no such contribution during these approaches but seemed content to rely on his own assessment of the commander as being someone in whom he had complete confidence both in terms of judgement and ability. It seems probable that it was the automatic height call out at 50 feet that provided the impetus for the FO to make a purposeful and independent intervention when he called "Go-around".

Neither the commander nor the FO demonstrated the most basic principles of CRM during these approaches. The commander had attended a two day CRM course in 1995 and his subsequent recurrent CRM training had been conducted in the simulator as allowed by the CAA. The FO had attended an approved CRM course just two weeks prior to the accident but had apparently not yet assimilated much of that training. The operator now has CAA approval for a CRM course designed to its own requirements. This training will be conducted in a new training facility when it is commissioned in 1999."

(AAIB Air Accident Report No 3/99).

This incident emphasizes the problems that arise when CRM training fails to have its intended benefits. The First Officer had attended an approved CRM course just two weeks prior to the accident. The AAIB report attributes the limited impact of this course to the observation that he "had apparently not yet assimilated much of that training". This requires further explanation. Individual factors may account for a single pilot failing to demonstrate any practical application of CRM techniques after successfully graduating from an approved course. The UK CAA’s Aeronautical Information Circular (AIC) 143/1993, issued on 23 September 1993, explicitly states that CRM training is "not a quick fix that can be implemented overnight" and is not "a scheme that occurs independently of other on-going training activities". However, the recurrence of CRM failures justifies a deeper examination of the reasons why such incidents continue even for those individuals who have attended several CRM training courses and who have performed well in simulated emergencies.



There are a number of reasons why CRM training can fail to deliver the benefits that have been identified by its supporters. Perhaps the most important of these is that CRM is only one of several complex factors that affect human performance throughout the aviation industry. In particular, factors such as high levels of fatigue can undermine even the most effective forms of CRM training. This is a common theme in many ASRS and CHIRP reports.

"Fatigue and CRM

A high-workload phase of flight, frequency congestion, heavy traffic, and fatigue sometimes combine with less than optimum cockpit resource management to push pilots and controllers to their limits. When non-standard phraseology enters the picture, things can quickly fall apart as they did in this airborne conflict near Denver.

"The Controller was very busy, on the verge of overload...The Controller, with no warning or explanation called, '[Air Carrier X], the traffic you're following is turning final for Runway 26, a company [jet].' We looked at our 3 o'clock position and saw a [jet] inbound for the runway. My F/O, without asking me, called the traffic in sight [to ATC]...Just prior to our turn to final the Controller called with a frantic, 'You followed the wrong aircraft, turn right heading 270 degrees and climb to 5,000 feet'...I feel this was caused by improper phraseology and procedures, heavy traffic, crew fatigue, 12th leg in 27 hours, and a breakdown in cockpit communications." (# 248002)"

(ASRS DirectLine, Issue No. 7 : September 1995).

The previous example illustrates one of the underlying concerns that motivate this paper. CRM problems are often identified in incidents that also involve more systemic issues, such as high levels of ATC workload or flight deck fatigue. It can, therefore, be argued that CRM problems are symptomatic of these deeper failures. There is also a concern that "poor CRM" is being used to label many different forms of human problems during incidents and accidents. There has been a move away from blaming individual instances of human error as a precursor to many failures; instead, all forms of communication, coordination and decision-making problems are regarded as CRM issues. This is problematic because it can obscure situations in which crew performance was primarily impaired by problems imposed by the environment, such as high ATC workloads, or by operational factors such as high levels of fatigue.

There are further reasons why CRM training can fail to prevent accidents and incidents in the aviation industry. In particular, there is little evidence that periodic CRM training can affect the social and cultural norms that have a profound impact upon the operational performance of crewmembers. This is illustrated by numerous reports from the ASRS and CHIRP collections.

"In several accident/incident investigations the "Flight Deck Gradient", a term used to describe the reluctance of a less experienced crew member to question a decision/action of a Captain, has been cited as a contributory cause. Recognition of this potentially hazardous effect is often included as an aspect of CRM training, but the problem can be extremely complex, particularly if combined with an apparent short-term incapacitation. In such circumstances, it is often difficult for the junior crew member to intercede.

It was the Captain's leg. He is an experienced pilot, capable and well liked and in no way overbearing. On short finals to Rwy 30 at ####, after a good, stabilised visual circuit and approach, the aircraft begins to descend below the VASI indications, giving finally four reds. As the runway has a displaced threshold and the obstacle was now behind us I make no comment, as I presume the descent (below the correct glide-path) is intentional to facilitate an early touch-down point. (The runway is relatively short for our type of aircraft).

The Captain now sees the VASI indications, says so, and applies power. I call "Rad Alt 50", "30" and "20" but we don't land. I inform the Captain we are floating and to put the aircraft on the ground. He seems surprised by my call, but removed power and lands. However, we are between 1/3 to 1/2 of the way down the runway. The Captain appears transfixed by the runway and hasn't engaged reversers as per SOP. I call for reversers and query the autobrake setting of level three out of five available levels. He makes no response although he is not obviously unwell. I state that I am increasing autobrake to level four. He doesn't acknowledge. As speed reduces he finally deploys the reversers, but as our Normal Operations SOP, only at idle thrust. We stop with approximately 200ft runway remaining. On taxi back he states he had difficulty reading the VASI and no other discussion occurs.

With hindsight I allowed my attitude of respect and friendliness toward the Captain to influence my actions. I was insufficiently assertive once the incident was in progress and prior to the incident I presumed rather than checked the reasons for his flight profile."

(CHIRP Feedback No. 46 April 1998)

This incident illustrates how CRM failures often stem from complex social circumstances. Of course, many CRM courses explicitly include exercises that are intended to help crewmembers deal with problems such as the "Flight Deck Gradient". However, as the CHIRP editor argues, this "problem can be extremely complex, particularly if combined with an apparent short-term incapacitation". Some of this complexity stems from the impact of "alienation" during simulator studies. This occurs when individuals and team members unknowingly alter their behavior during simulator exercises. The social context of these training sessions reduces the impact that problems, such as the Flight Deck Gradient, have upon crew performance; participation in the exercise itself temporarily suspends the social norms that often contribute to incidents and accidents, and so performance in the simulator may not predict behavior during line operations.



Crew Resource Management (CRM) techniques have been proposed as means of reducing the opportunity for human error in aviation. They are also intended to improve the detection and mitigation of those errors that do occur. There are, however, a number of limitations with existing approaches. For instance, it has been argued that tutorials about communication failures do not alter team behavior under abnormal conditions. This has resulted in the increasing use of simulators in CRM training. Conversely, other analysts have identified a lack of integration between CRM and Standard Operating Procedures. This has led the FAA to develop revised ACRM techniques. Much of this work has been based on an analysis of accidents and incidents in which poor CRM has been identified as a primary or contributory factor.

In contrast, this paper has attempted to use national aviation reporting systems to identify successful instances in which CRM techniques helped to either detect or resolve particular failures. Although positive examples of CRM do exist within the ASRS collection, very few are published in either Feedback or Callback. There are, however, many published reports that point to the problems caused by poor CRM rather than to the incidents that have been avoided by good CRM practices.

This analysis leads to wider questions about whether individuals who have received CRM training actually behave differently in either everyday operation or under extreme circumstances. There is limited data about the benefits of CRM in nominal situations. However, there are a growing number of reports, such as the AAIB account of the Puerto Plata accident and the NTSB report into FedEx 1406, in which relatively recent participation in CRM training had no observable effect on crew performance.

The closing sections of this paper have, therefore, argued that wider systemic factors, such as crew fatigue and the "flight deck gradient" of authority, have repeatedly undermined even the most recent attempts to instill good CRM practice within the aviation industry. The consequence of this analysis is that it may be time to look beyond CRM training as a panacea for human factors problems. It may be time to look at the underlying causes of high workload, of distraction and of poor decision making rather than continuing to advocate the coping strategies that are instilled by CRM training.



Thanks are due to the members of the Glasgow Accident Analysis Group and to the Glasgow Interactive Systems Group.



  1. T. Seamster, Automation and Advanced Crew Resource Management. In S. Dekker and E. Hollnagel (eds.) Coping with Computers in the Cockpit, Ashgate, Brookfield, USA, 1999.
  2. R.L. Helmreich, A.C. Merritt and J.A. Wilhelm, The Evolution of Crew Resource Management in Commercial Aviation. The International Journal of Aviation Psychology, 9:19-32. 1999.
  3. C.A. Bowers, E.L. Blickensderfer and B.B. Morgan, Air Traffic Control Specialist Team Coordination. In M.W. Smolensky and E.S. Stein (eds.) Human Factors in Air Traffic Control. Academic Press, San Diego, USA, 1998.
  4. C.W. Johnson, Why Human Error Analysis Fails to Support Systems Development, Interacting With Computers, (11)5:517-524, 1999.
  5. H. Foushee and R.L. Helmreich, Group Interaction and Flight Crew Performance. In E.L. Wiener and D.C. Nagel (eds.) Human Factors in Aviation. Academic Press, San Diego, USA, 1988.
  6. J. Rasmussen, A.M. Pejtersen and L.P. Goodstein, Cognitive Systems Engineering. J. Wiley and Sons, New York, USA, 1994. Chapter 6: "At the Periphery of Effective Coupling: Human Error".
  7. C.W. Johnson, Designing Forms To Support The Elicitation Of Information About Incidents Involving Human Error. Submitted to 19th European Conference on Human Decision Making and Control, Ispra, Italy. 2000.
  8. T.L. Seamster, J.R. Cannon, J. Purcell, R.M. Pierce, R. Fischer and R. Redding, Analysis of En Route Air Traffic Controller Team Communication and CRM. In Proc. of the 36th Annual Meeting of the Human Factors Society, Santa Monica, USA, 1992.