This is a draft - comments and criticisms very welcome.

The Limitations of Aviation Incident Reporting

Chris Johnson

Department of Computing Science
University of Glasgow,

Glasgow, G12 8QQ, Scotland.
+44 141 330 6053
johnson@dcs.gla.ac.uk

www.dcs.gla.ac.uk/~johnson


ABSTRACT

Incident reporting schemes have been proposed as a primary means of ensuring that failures in existing systems are not propagated into future applications. The success of the FAA/NASA’s Aviation Safety Reporting System (ASRS) and of the independent Confidential Human Factors Incident Reporting Programme (CHIRP) has led to the development of similar schemes throughout the world. While these initiatives have made significant contributions to aviation safety, this paper looks for further ways in which we might improve the reporting of human error.

Keywords

Incident reporting; human "error"; safety.

INTRODUCTION

Incident reports are primary mechanisms for improving safety in many different industries. For instance, the Chemical Safety and Hazard Investigation Board operates a scheme that covers all users and producers of chemicals within the United States. A national reporting system for the UK railway network has been proposed in the aftermath of the Paddington/Ladbroke Grove crash. Similar schemes also operate at local and regional hospitals throughout Europe. Many of these initiatives stem from the perceived success of incident reporting in aviation [1]. In particular, the FAA/NASA’s Aviation Safety Reporting System (ASRS) and the UK’s Confidential Human Factors Incident Reporting Programme (CHIRP) are often cited as prototypes for the wider application of confidential and anonymous reporting systems [2].

There are many different factors that contribute to the perceived success of incident reporting systems within aviation. For instance, they can be used to detect failures that might otherwise have led to more serious consequences. The findings of national and regional schemes can be exchanged so that trends can be identified and lessons can be shared across national borders. They also help to keep staff "in the loop". Most reporting systems include mechanisms by which the findings of previous submissions can be fed back to other workers. This helps to raise awareness about the potential for failure. Such feedback also reminds operators about possible detection factors and about the safeguards that can be deployed to mitigate any future risks [2].

A number of well-known problems affect the utility of incident reporting schemes as means of addressing the causes of human error. For instance, there is a tendency for some schemes to rely on reminder statements that encourage individuals and teams to "be careful" or to "work harder" on aspects of their performance [3]. Too often, incident reports result in reminder statements to staff rather than in requests for manufacturers to improve the design of their devices. Safety concerns end up as appeals to improve human performance rather than as management initiatives to avoid the precursors of operator "error". This paper shows how some of these problems can affect aviation reporting schemes. Later sections go on to identify a range of techniques that have been used to address these concerns within the ASRS and CHIRP systems.

AN ANATOMY OF AVIATION REPORTING SCHEMES

Both the ASRS and CHIRP are well established. The ASRS was set up in 1976 and now receives an average of more than 2,600 reports per month. It encourages submissions about incidents with a wide range of technical, organizational and meteorological causes. In contrast, the much smaller-scale CHIRP is specifically focussed on human factors issues. It was established in 1982 and in 1998-99 received a total of 228 reports. These two schemes, therefore, differ both in their scale and in their intent. There are also some notable similarities. Both the ASRS and CHIRP rely upon individuals returning paper-based forms. Figure 1 illustrates an Air Traffic Control report form from the CHIRP system. There are further similarities. Both schemes are confidential rather than anonymous. Any necessary information that cannot be obtained directly from the form can be elicited during follow-up interviews. Any identifying features in the report are then removed prior to further analysis.

Figure 1 CHIRP Air Traffic Control Forms (Feb. 2000).

Figure 2 illustrates an excerpt from an ASRS air traffic management incident. An increase in traffic led a controller to request an increase in separation for aircraft entering his sector. This was not achieved and there was a loss of separation. This is typical of reports that describe how environmental factors, such as increased traffic, create the preconditions for aviation incidents.

ACCESSION NUMBER: 426859

DATE OF OCCURRENCE: 9901

NARRATIVE: I WAS WORKING THE LOW ALT (0-FL230) WINDSOR SECTOR AT ZOB. THIS SECTOR IS AN OUTBOUND SECTOR FOR DETROIT APCH CTL. THERE WAS EXCESSIVE TURB IN THE HIGH ALT SECTOR OVER MY AIRSPACE SO THERE WERE SEVERAL OVERFLT ACFT IN MY SECTOR THAT NORMALLY WOULD NOT BE THERE. ALSO, ADDITIONAL ACFT NOT ON PREFERRED ROUTING IMPACTED COMPLEXITY. TFC WAS BUILDING AND SECTOR BECAME EXTREMELY BUSY. DETROIT WAS SUPPOSED TO PROVIDE 15 MI IN TRAIL OVER 2 FIXES INTO MY SECTOR. THE AIRPLANES BEGAN TO ENTER AT 10 MI IN TRAIL, THEN QUICKLY BEGAN TO COMPRESS. I CALLED DTW TO ASK FOR MORE SPACING. AT THAT POINT MY SUPVR TRIED TO PUT A TRACKER IN TO HELP. DUE TO CONFUSION OVER VSCS PROC, I LOST COM WITH ACFT 3 OR 4 TIMES. MY D-SIDE WAS STRUGGLING TO PUT IN REVISED ROUTING FROM TMU AND COULD NOT PROVIDE ASSISTANCE. DETROIT HANDED ME A DC9 (ACFT Y) FOLLOWED BY THE A320 (ACFT X) WITH A 70-90 KT OVERTAKE WITH ONLY 5 MI SEPARATION. NEEDLESS TO SAY, 5 MI SEPARATION WAS LOST QUICKLY AT 13000 FT. LOA STATES THAT DTW SHALL PROVIDE 5 MI INCREASING BUT TODAY THEY WERE SUPPOSED TO PROVIDE 15 MI INCREASING. THE TRACKER NEVER MADE IT IN. I WAS DISTR BY ALL OF THE CONFUSION. I'M NOT SURE WHO ACCEPTED THE HDOFS. I ISSUED NO CTL INSTRUCTION TO EITHER ACFT. SEPARATION WAS LOST IMMEDIATELY UNTIL I CLBED THE LEAD ACFT.

Figure 2: ASRS Air Traffic Control Incident (Aug., 1999)

Both the ASRS and CHIRP publish newsletters that help to keep crews, air traffic controllers and engineers informed about incidents such as that illustrated in Figure 2. The CHIRP Feedback newsletter has a circulation of approximately 30,000. The ASRS’s Callback is distributed to 85,000 aviation professionals. Both are written in an informal and accessible style. Both provide an overview of the incidents that have been reported in previous months. They also focus on specific "critical" topics, such as ramp safety. The ASRS also publishes the DirectLine journal, which is intended for operators and flight crews of commercial carriers and corporate fleets.

SHORT-TERM FIXES?

The ASRS and CHIRP are amongst the most successful incident reporting schemes currently operated in the aviation industry. Callback received Flight Safety Foundation publication awards in 1981 and 1987. The Aviation/Space Writers Association's (AWA) Award of Excellence was awarded to its editors in 1982 and 1992. In 1999, 99% of the 28,384 people who responded to a Feedback survey stated that "CHIRP makes a useful contribution to the promotion and improvement of flight safety". Only 0.6% disagreed with this statement. It is, however, important to look beyond these figures and awards to ask what lessons can be learnt from the success of the ASRS and CHIRP. It is also important to identify any ways in which these existing schemes might be improved. For example, the following excerpt describes an incident and proposes a solution:

"Problem: on landing, gear was unlocked but up. Contributing factors: busy cockpit. [I] did not notice the gear down-and-locked light was not on.

Discovered: Gear up was discovered on landing.

Corrective action: [I] was unable to hear gear warning horn because of new noise canceling headsets. I recommend removal of one ear-piece in landing phase of flight to [allow] audible warning devices to be heard by pilot. The noise-canceling headsets were tested by three people on the ground and all three noted that with the headsets active that the gear warning horn was completely masked by the headsets."

(ASRS Callback, Issue 247, January 2000).

The previous citation illustrates the strengths and weaknesses of many incident reporting schemes. They provide first-hand insights into usability problems that affect pilots, air traffic controllers and engineers. They also provide common-sense solutions that help users to combat the challenges that poorly designed equipment can create. However, there is also a danger that immediate remedies to individual incidents will fail to address the root cause of a problem [4]. The noise-cancelling headsets were clearly not fit for purpose. The recommended "solution" of removing one ear-piece, whilst providing a short-term fix for individual pilots, does little to ensure that such problems will not recur in future products. The previous excerpt, therefore, illustrates the first potential problem for incident reporting schemes. By focussing on the immediate symptoms of a problem, they may obscure deeper usability issues. Many reporting systems address this weakness by publishing more sustained investigations into several incidents. For instance, the ASRS does this through its DirectLine publication. There are, however, a number of further concerns about the treatment of human error in different reporting systems.

CONSISTENT SOLUTIONS?

An important benefit of incident reporting systems is that they enable lessons that have been learnt in one organization to be transferred to other companies in the same industry [5]. Similarly, lessons that have been learnt in commercial operations can be propagated to general aviation. In practice, however, it can be difficult to ensure that lessons are transferred in this manner. A particular problem is that many publications tend to respond to incidents in a piecemeal way. Similar incidents can provoke different responses in different schemes. This is particularly significant because many pilots read and contribute to incident reporting schemes in more than one country. For instance, the following Feedback excerpt offers a slightly different perspective on the problems that are posed by ambient noise:

"Fortunately, I have no incident to report. I would like, however, to highlight a common practice by some airlines, including my employer, which I feel is a significant risk to flight safety: namely the practice of not using flight deck intercom systems in favour of half wearing a headset over one ear for VHF comms, whilst using the other ear, unaided, for cockpit communications. And all this in what are often not so quiet flight decks.

I cannot believe that we do not hear much better with two ears than with one, and many are the times when I, and other colleagues of mine, have had to ask for the other crew member to repeat things because of aircraft noise in one ear, and ATC in the other with the volume turned high enough not to miss a call. Not the best answer in a busy terminal area after a long flight, and an unnecessary increase in stress factors. Myself and others have raised this point several times to our training and safety departments, all of which has fallen, pardon the pun, onto deaf ears. The stock answer is that there is no written down SOP on intercoms, and common agreed practice rules. In reality, the guy in the right hand seat has no influence without things getting silly.

As even single ear-piece headsets are not incompatible with intercoms, I would have thought a compromise would be mandatory use of full headset and intercom at the busy times, say below a given flight level, with the option for personal preferences in the cruise. Volumes for different communication channels could be adjusted to suit, and surrounding noise significantly reduced. This would preclude the need to speak louder than usual to be heard, to ask for repetitions, and generally improve the working environment. After all, if the CAA and other agencies have made intercoms mandatory in transport aircraft, it will be for a reason.

CHIRP Comment: The use of headsets for the purpose of effective reception of RTF/intercom messages between flight crewmembers is not mandated. The certification requirement for an intercom system is to provide communication between all crewmembers in an emergency. The partial/full use of a headset in normal operations should be dependent on the ambient noise level on the flight deck. For this reason, some operators specify the headset policy by aircraft type and phase of flight, as the reporter suggests."

(CHIRP Feedback, Issue No: 51 July 1999)

This illustrates how the individual incident reports that are received by different national schemes offer different perspectives on a common problem. The ASRS Callback article suggests that only one headset should be used during landing. In contrast, the CHIRP reporter argues that this practice poses a significant risk to flight safety. The feedback associated with the second report resolves this contradiction by arguing that the partial/full use of headsets should be dependent on ambient noise. However, this insight would not be readily apparent to many of the national crews who only have access to the first publication. This, therefore, illustrates a second concern about aviation incident reporting systems. Different national systems have offered different forms of advice to crews and engineers in response to very similar incidents [6]. These differences do not reflect underlying differences in SOPs or in the regulatory environments. Instead, they are symptomatic of the inconsistencies that can arise when immediate or "stop-gap" advice is provided in response to individual human factors reports.

REITERATING WELL-KNOWN SOLUTIONS?

Previous paragraphs have argued that in a justifiable desire to provide rapid feedback, many incident-reporting systems focus on short-term solutions rather than the underlying causes of usability problems. It has also been argued that this can lead to inconsistencies between major incident reporting systems. This inconsistency prevents individuals from receiving a coherent message about common safety problems in an international industry. The following excerpt from the CHIRP Feedback publication illustrates a further concern.

"On pre-flight check I loaded the Flight Management Computer (FMC), with longitude WEST instead of EAST. Somehow the FMC accepted it (it should have refused it three times). During taxi I noticed that something was wrong, as I could not see the initial route and runway on the navigation map display, but I got distracted by ATC. After we were airborne, the senior cabin attendant came to the flight deck to tell us the cabin monitor (which shows the route on a screen to passengers) showed us in the Canaries instead of the Western Mediterranean! We continued the flight on raw data only to find out that the Heading was wrong by about 30-40°. With a ceiling of 1,000 ft at our destination I could not wait to be on "terra firma". Now I always check the Latitude/Longitude three times on initialisation!

A simple but effective safeguard against 'finger trouble' of the type described is for the pilot who does not enter the data to confirm that the information that he/she sees displayed is that which he/she would expect. Then, and only then, should the ‘Execute' function button be pressed." (CHIRP Feedback, Issue 53, January 2000)

This documents "yet another" cockpit data entry problem [7]. The corrective advice has the same common-sense motivation as the ASRS report’s advice about noise-cancelling headsets. It is well founded and sensible. However, it reiterates ideas that have formed a standard part of Crew Resource Management (CRM) training for almost twenty years. It is depressing that such data entry problems continue to be reported so long after regulatory authorities acted to mandate CRM training. UK Aeronautical Information Circular (AIC) 143/1993 (Pink) states that all crew must have completed an approved CRM course before January 1995. JAR-OPS sub-part N, 1.945(a)(10), 1.955(b)(6) and 1.965(e) extended similar requirements to all signatory states during 1998. The previous excerpt, therefore, illustrates a third problem for incident reporting schemes. Without further support and analysis, there is a danger that the pro-active role of incident reporting systems will be lost and that these schemes will continue to recommend well-known solutions to well-known problems. An alternative interpretation of the previous CHIRP report is that existing CRM training has not been as successful as many people would claim [8].
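The cross-check that the CHIRP comment recommends can be captured mechanically as well as procedurally. The following sketch illustrates the principle, not any actual FMC implementation: an entered initialisation position is checked against the known departure airport before the 'Execute' function is accepted, so that an EAST/WEST sign slip of the kind reported is rejected. The function names, coordinates and tolerance are illustrative assumptions.

```python
# Illustrative sketch of the position cross-check described in the CHIRP
# comment above. Not a real FMC interface: names, coordinates and the
# tolerance are hypothetical. West longitudes are negative, east positive,
# so entering WEST instead of EAST flips the sign and fails the check.

def within_tolerance(entered, reference, tolerance_deg=1.0):
    """True if the entered (lat, lon) lies close to the reference position."""
    lat_e, lon_e = entered
    lat_r, lon_r = reference
    return abs(lat_e - lat_r) <= tolerance_deg and abs(lon_e - lon_r) <= tolerance_deg

def confirm_init_position(entered, departure_airport):
    """Reject an initialisation position that disagrees with the departure airport."""
    if not within_tolerance(entered, departure_airport):
        raise ValueError("Initialisation position disagrees with departure airport - re-check entry")
    return entered

# A Western Mediterranean departure point, roughly 39.6N 2.7E (illustrative).
departure = (39.6, 2.7)

# A correct entry passes the check:
confirm_init_position((39.6, 2.7), departure)

# The reported slip - longitude entered WEST instead of EAST - is rejected:
try:
    confirm_init_position((39.6, -2.7), departure)
except ValueError:
    print("entry rejected")
```

The point of the sketch is that the safeguard is cheap: the reference position is already known to the system, so "refusing it three times", as the reporter expected, requires no information beyond the flight plan itself.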

DON’T REMIND ME

The previous section argued that many incident reporting systems reiterate well-known solutions to well-known problems. It is important not to dismiss the value of this for many safety-critical environments. Incident reporting systems play an important role in raising awareness about safety issues. By documenting incidents that have occurred to others, the intention is to alert colleagues to the potential for future failures. Publications such as Feedback and Callback, as well as DirectLine and the ASRS Operational Bulletins, play a critical role in increasing the visibility of these common occurrences:

"We have also received reports complaining about "draconian" requirements for the wearing of High-visibility jackets that have been placed on flight crew at some UK airports. As an example, it has been questioned whether it makes sense to require all flight crewmembers to wear a High-visibility jacket when proceeding as a crew along passenger walkways to/from an aircraft and yet permit large numbers of passengers to transit the same ramp area, with minimal supervision, sometimes led by a single airport staff member? The wearing of High-vis clothing by all crewmembers, when on the ramp area, is a prudent safety policy. However, a safety policy such as this should be consistently applied. It is difficult to equate the requirements placed on flight/cabin crews at some airports with those for passengers, many of whom are far less aware of the potential dangers that are ever-present on the ramp.

Inconsistent rules invariably lead to poor observance and thus may fail to provide the protection for which they were intended. If you don't like the rules as they are, involve your management in having them formally reviewed.

The key rule for everyone who uses the ramp must be always to remain alert to potentially dangerous situations at all times, particularly when the weather and/or ramp surface make working conditions unpleasant. If in doubt - wait a few seconds longer - a rotating jet/propeller invariably wins any contest."

(CHIRP Feedback, Issue No. 53, January 2000)

The reiteration of well-known safety recommendations also raises a number of more fundamental concerns about the utility of incident reporting systems for the long-term safety of many applications. There is a considerable body of human factors research that points to the dangers of any reliance on reminders. Unless people are continually reminded, they are likely to forget the importance of safety precautions over time [9]. The previous excerpt was chosen carefully because ramp safety is an area where there has been, and continues to be, a high level of injuries and accidents. These incidents recur frequently within both the ASRS and the CHIRP reports, and the same advice continues to be offered in response to broadly similar incidents. For example, this topic was addressed in DirectLine no. 8 and in Feedback no. 53. It might, therefore, be argued that sustained action to remove the fundamental causes of these injuries and accidents would be more effective than the repeated reminders offered by incident reporting systems.

WHAT IS THE APPROPRIATE FORUM?

The previous paragraphs have argued that incident reporting systems help to raise awareness about common safety concerns. They have also argued that by emphasizing short-term, well-known remedies, there is also a danger that they may take attention away from the underlying usability problems that affect many aspects of the aviation industry. Of course, it can be argued that the individuals who operate the systems, fly the aircraft and submit the reports can do little to address these deeper concerns. An individual pilot or air traffic control officer can do little to change the design of the complex systems that they operate. Callback and Feedback might not, therefore, provide an appropriate forum for discussing the root causes of aviation incidents.

These arguments are countered by the ASRS’ remit to "1. Remedy reported hazards, 2. Conduct research on operational safety problems, and 3. Facilitate a better understanding of aviation safety issues" (http://asrs.arc.nasa.gov/overview.htm, Feb. 2000). This quotation presents some interesting distinctions. The previous two citations from Feedback and from Callback illustrate the concern to remedy reported hazards in a direct manner. However, they do not show how incident reporting systems can be used to improve our more general understanding of aviation safety issues and, in particular, to raise awareness of usability concerns. Fortunately, ASRS’ annual DirectLine publication and the less frequent Operational Issues Bulletin look at the more generic issues that arise in aviation incident reports.

"The type of confusion experienced by this flight crew over their (Pre-Departure Clearance) PDC routing is potentially hazardous, as noted by a controller reporter to ASRS: "It has been my experience...that several times per shift aircraft which have received PDCs with amended routings, have not picked up the amendment...I have myself on numerous occasions had to have those aircraft make some very big turns to achieve separation." (ACN # 233622). The sources consulted by ASRS suggested several potential solutions to this problem:

-- SFO 6 SFO LIN J84 MVA J198 ILC --

SFO LIN OAL J80 ./. BWI

Show it this way:

-- SFO 6 LIN J84 MVA J198 J80 ./. BWI"

(ASRS Operational Issues Bulletin 96-01, December 1996).

The previous citation from the ASRS’ Operational Issues Bulletin provides a more detailed causal analysis of incident reports. The annual DirectLine reports publish the results of similar studies. These two publications address the concerns that were raised in previous sections. They "facilitate a better understanding of aviation safety issues". However, they have a more limited audience than Feedback or Callback. Research into the effectiveness of incident reporting systems in other domains has stressed the importance of providing the individuals who contribute to an incident reporting scheme with direct feedback about the effect of their contributions on fundamental and long-running concerns. For example, incident reporting systems have had little impact on workload issues in the aviation industry [7]. Neither does there seem to be any decline in the usability problems that recur with each new generation of computer systems [9].
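The PDC confusion described in the bulletin is, at root, a failure to notice that a cleared route differs from the route the crew expected. A token-by-token comparison of the two route strings makes such amendments explicit. The sketch below is purely illustrative and forms no part of any ASRS recommendation; the function names are hypothetical, and the example strings are taken from the bulletin excerpt above.

```python
# Illustrative sketch: flag the segments of a cleared route that do not
# appear in the filed route, so an amended routing cannot pass unnoticed.
# Hypothetical helper names; not part of any real PDC system.

def route_tokens(route):
    """Split an ATC route string into its fix/airway tokens."""
    return route.split()

def amended_fixes(filed, cleared):
    """Return tokens in the cleared route that are absent from the filed route."""
    filed_set = set(route_tokens(filed))
    return [tok for tok in route_tokens(cleared) if tok not in filed_set]

# Route strings from the bulletin excerpt above:
filed = "SFO LIN OAL J80 ./. BWI"
cleared = "SFO 6 LIN J84 MVA J198 J80 ./. BWI"

print(amended_fixes(filed, cleared))  # → ['6', 'J84', 'MVA', 'J198']
```

A crew or a ground system running such a comparison would be prompted to brief exactly the amended segments, rather than relying on someone spotting the change in a block of undifferentiated text.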

ADDRESSING CULTURAL PROBLEMS?

There is no counterpart to the ASRS Bulletins within the CHIRP scheme. However, there are good examples of more sustained causal analyses of usability issues within Feedback. For example, the following excerpt draws attention to both the usability and the safety issues that can arise from poorly designed computer systems.

"At the start of the Winter heavy maintenance programme, the company railroaded into place a computerised maintenance and integrated engineering and stores, planning and labour recording system. No training was given on the operational system only on a unit under test. Consequently we do not look at airplanes any more just VDU screens, filling in fault report forms, trying to order parts the system does not recognise, as the stores system was not programmed with (aircraft type) components (the company wanted to build a data base as equipment was needed).

When the computer informed us the 'C' check was complete and issued the CRS certification forms, I requested a task and certification report so I could convince myself that the work had in fact been recorded correctly. I was told this couldn't be done. After refusing to release the aircraft, the systems people managed to miraculously find one. The record had numerous faults, parts not recorded as being fitted, parts removed with no replacements, parts been fitted two or three times, parts removed by non-engineering staff, scheduled tasks not called-up by planning, incorrect trades doing scheduled tasks and certifying, and worst of all the record had been altered by none certifying staff after the CRS signatories had closed the work.

Quality Airworthiness Department were advised of these deficiencies and shown actual examples. We were advised by the management that these problems are being addressed but they are not, we still have exactly the same problems today. What am I to do without losing my job and career. In a closed community like aviation, troublemakers and stirrers do not keep jobs and the word is spread around. If I refuse to sign the CRS somebody who has not worked on the aircraft will be found to clear it (contravention of ANO?). (Air Navigation Order)…

The Company concerned was approached on this issue and responded that they had become aware of the difficulties being experienced. At the time this report was discussed they were just introducing a scheme whereby staff could report problems and get feedback on progress as part of their policy to encourage an open reporting culture. The certification procedures were specifically addressed. However, this would appear to have been another example of a complex computer system being introduced, or upgraded, without ensuring that the staff, who ultimately have to operate it, being consulted and trained properly at the outset."

(CHIRP Feedback, Issue No: 49 January 1999).

The comment that aviation is a "closed community" and that "troublemakers and stirrers do not keep jobs" captures a feeling that is often repeated in confidential interviews and in published incident reports. Some authors have claimed that incident reporting schemes help to encourage a more open culture [1]. Recent moves to develop incident reporting schemes within the UK railways and within many hospitals have been justified by claims that they will protect individuals who point out potential threats to safety. Incident reports, such as the one cited above, illustrate that these schemes can do very little to affect the underlying safety culture of the companies and organizations within particular industries.

CONCLUSION AND FURTHER WORK

Incident reporting schemes help to ensure that failures in existing systems are not propagated into future applications. The success of the FAA/NASA’s Aviation Safety Reporting System (ASRS) and of the independent Confidential Human Factors Incident Reporting Programme (CHIRP) has led to the development of similar schemes throughout the world. While these initiatives have made significant contributions to aviation safety, this paper has identified further ways in which we might improve the reporting of human error:

  1. Many of the publications that are provided to the individuals who contribute incident reports tend to focus on direct solutions to their everyday problems. There is a danger, however, that these short-term panaceas will obscure deeper recurring safety issues, such as the continuing usability concerns with many computer applications in the aerospace industry.
  2. Many reporting systems address the concerns described above by publishing overviews that address the common causal features in several incidents. CHIRP has introduced editorial overviews. The ASRS has sister publications such as DirectLine. Research into incident reporting systems in other domains stresses the importance of providing the individuals who contribute to an incident reporting system with direct feedback about the effect of their contributions on long running concerns as well as short-term fixes [1, 3].
  3. Different national systems have offered different forms of advice to crews and engineers in response to very similar incidents. There are few examples of direct contradiction; however, there are many cases of inconsistency in the advice that is offered to pilots, air traffic controllers and engineers.
  4. Many reporting systems re-iterate the importance of solutions that are recommended as part of standard training activities. The need for such reminders, and the incidents that generate them, raises wider concerns. For instance, some training practices, such as CRM [8], have failed to address many of the problems that continue to be reported with cockpit communication and coordination.
  5. There is a considerable body of human factors research that points to the dangers of any reliance on the reminders mentioned in the previous paragraph. Unless people are continually reminded, they are likely to forget the importance of safety precautions.
  6. Incident reporting schemes have not had a radical effect on the underlying safety culture of many of the companies and organizations within the aviation industry.

ACKNOWLEDGMENTS

Thanks are due to the members of the Glasgow Accident Analysis Group and to the Glasgow Interactive Systems Group.

REFERENCES

  1. W. van Vuuren, Organisational Failure: An Exploratory Study in the Steel Industry and the Medical Domain, PhD thesis, Technical University of Eindhoven, Netherlands, 1998.
  2. C.W. Johnson, Using Case-Based Reasoning to Support the Indexing and Retrieval of Incident Reports. To appear in Proc. of European Safety and Reliability Conference Edinburgh, Scotland, March 2000.
  3. D.K. Busse and C.W. Johnson, Human Error in an Intensive Care Unit: A Cognitive Analysis of Critical Incidents. In J. Dixon (editor) 17th International Systems Safety Conference, Systems Safety Society, Unionville, Virginia, USA, 138-147, 1999.
  4. T. van der Schaaf, Human Recovery of Errors in Man-Machine Systems. In CCPS’96: Process Safety Management and Inherently Safer Processes, American Institute of Chemical Engineers, Florida, USA, 1996.
  5. J. Reason, Managing the Risks of Organisational Accidents, Ashgate, Aldershot, UK. 1998.
  6. C.W. Johnson, Supporting The Analysis Of Human Error In National And International Incident Reporting Schemes. Submitted to the 19th European Conference on Human Decision Making and Control, Ispra, Italy. 2000.
  7. C.E. Billings, Aviation Automation. Lawrence Erlbaum Associates, New Jersey, USA, 1997.
  8. C.W. Johnson, Reasons For The Failure of CRM Training. Draft - available on request.
  9. V.D. Hopkin, The Impact of Automation on Air Traffic Control Specialists. In M.W. Smolensky and E.S. Stein (eds.) Human Factors in Air Traffic Control, Academic Press, San Diego, USA, 1998.