Human Factors Engineering is the Basis for a Practical Error-in-Medicine Curriculum

J.W. Gosbee, Center for Applied Medical Informatics, Michigan State University Kalamazoo Center for Medical Studies, USA

Keywords: error curriculum, human factors engineering, error countermeasures, teaching

 

Abstract

It is essential that health care practitioners and students learn the underlying reasons for their mistakes and appreciate appropriately designed countermeasures. During training, medical practice, and administrative duties, health care personnel need to have a constructive attitude toward, and knowledge about, error-in-medicine. However, many clinicians are led to believe that humans are perfectible and that the "blame and train" approach is the optimal route to improving patient care. In contrast, a practical error-in-medicine curriculum, based on human factors engineering, has been taught to over 300 medical, nursing, and pharmacy students during health care informatics courses. The three main aims of the "error" curriculum are to have the students: 1) become an asset to organizational quality improvement activities; 2) become better evaluators and selectors of medical software and devices; and 3) assist and encourage feedback to medical software and device companies. Over five years, the teaching sessions have become more problem-based, simplified, and fun. Short- and long-term student evaluations (interviews and questionnaires) have been very positive. Many practical lessons have been learned which can support dissemination of this curriculum. The main points of this paper are to show that teaching about error-in-medicine is necessary, effective, and relatively easy.

 

Introduction

Preventable adverse events occur throughout the health care system. Large retrospective chart reviews in the eastern United States revealed that 3.7 percent of patients admitted to the hospital suffer an adverse event (ref. 1). Prospective studies in an Israeli intensive care unit uncovered more than one preventable error per patient, per day (ref. 2). Almost half of the fifty-three family physicians interviewed admitted to a mistake that led to a premature patient death (ref. 3). Books have been written on the subject of error in medicine (ref. 4), and a myriad of organizations have been created with the purpose of improving patient safety (refs. 5-7).

There is no lack of ambitious and comprehensive ideas to identify and reduce errors in health care settings. Leape (ref. 8) has suggested that error interventions need to occur at nearly every stage and level of clinical practice. From studies in many high risk industries, including medicine, Vincent (ref. 9) cites the limited usefulness of safety initiatives that are too narrow. Vincent proposes a framework for analyzing risk and safety in health care that also recommends a wide assessment. Finally, continuous quality improvement methods and systems thinking are touted throughout the United States and the world (ref. 10).

Will any of these efforts come to fruition without error-in-medicine education of researchers, educators, and, most importantly, health care personnel? If we are going to "attack" error and its root causes on "all fronts", will we not need thousands of recruits within all aspects of medical care? Most agree that awareness of the causes of and solutions to error in medicine will encourage and improve participation in quality improvement activities. However, two other important reasons for this education relate to the human factors engineering discipline. First, the learner needs to gain the interest and ability to assist and encourage feedback to medical software and device companies. This is actually required by organizations like the United States Food and Drug Administration (ref. 11) and done voluntarily through organizations like the Institute for Safe Medication Practices (ISMP) (ref. 5). Second, the learner needs to become a better evaluator and selector of medical software and devices. In other words, the training will need to improve the learner's consumer skills to "push" industry to adopt human factors engineering methods during design, which results in fewer latent errors (refs. 12-13).

Error-in-medicine training could fall in many places within a student's required course work. However, many aspects of understanding human error involve information handling and decision making. Therefore, it makes sense to include this training within a health care informatics curriculum. Since 1994, the educational activities described below have been used to teach over 150 students and residents during a month-long medical informatics rotation at Michigan State University Kalamazoo Center for Medical Studies (ref. 14). The curriculum has also been used for the month-long "Drug Information" rotation for doctor of pharmacy students at Ferris State University (Big Rapids, MI, USA). Finally, the human error curriculum is tailored to be an integral part of a "Nursing Informatics" course at Western Michigan University (Kalamazoo, MI, USA).

The main points of this paper are to show that teaching about error-in-medicine is necessary, effective, and relatively easy.

Similar Curricula

Teaching about the nature and reduction of human error within the context of human factors engineering is already done in universities, government, and industry. Many universities offer an introductory course for students seeking degrees in psychology, engineering, and computer science (ref. 15). Some have tried to identify common approaches for teaching semi-interested students (i.e., those for whom human factors engineering is not their major area of study) (ref. 16). Organizations and industry in high risk domains like aviation and nuclear power provide ongoing training courses on error and error countermeasures for many of their workers. Some of these curricula and teaching formats could be applied to health care students.

Studies on teaching human error and human factors engineering principles to health care personnel and students are limited. In the United Kingdom, a study of 129 nurses examined how their ways of coping with errors affected what they learned from them (ref. 17). Pilpel and his colleagues in Israel aimed their teaching program at medical students to "impart a tolerance of error" (ref. 18). They posited, but had no outcomes data, that teaching about the inevitability of error and how it can be reduced would lead to acceptance and, subsequently, to candid reporting. They also theorized that the curriculum would help the students realize that their doubts and fears were shared by peers and superiors. From interview data on 53 physicians, Ely and his colleagues concluded that physicians who understand common reasons for mistakes may be better equipped to prevent recurrence (ref. 3). Gaba and other groups are using anesthesia simulators and principles of human factors engineering to teach residents and practicing anesthesiologists (ref. 4). This continuing education is often done over a few days of simulated critical incidents followed by debriefing reviews of the videotaped sessions. Finally, Wu and his colleagues have written several articles on their study of medical residents and their errors, as well as strategies for educators to teach about and cope with those situations (refs. 18-19).

Requirements and Regulations in the United States

Professional groups and governing bodies for training medical doctors, nurses, and pharmacists have addressed the need to teach about the nature and reduction of error in medicine. Governing bodies for graduate and undergraduate medical education in the United States have several requirements for teaching quality assurance, continuous quality improvement, and systems thinking. The enumeration of these is beyond the scope of this article. The American Nurses' Credentialing Center, partnered with the American Nurses Association, lists human factors engineering (which would include error) as one of seven areas of competency (ref. 20). The American Society of Health-System Pharmacists cites the human factors engineering training of pharmacy directors as one of seven key strategies to reduce adverse drug events in hospitals (ref. 21). The Joint Commission on Accreditation of Healthcare Organizations may require the contribution of all these health care personnel in root cause analysis of sentinel (significant) events (ref. 22). Lastly, the United States Food and Drug Administration stops just short of requiring the training of physicians and nurses to participate in reporting and analyzing errors that occur with medical devices (ref. 23).

Overview of the Error Teaching Sessions

The optimal aims of an error-in-medicine curriculum are to have the students: 1) become an asset to organizational quality improvement activities; 2) become better evaluators and selectors of medical software and devices; and 3) assist and encourage feedback to medical software and device companies. Within the resources and practical time constraints at Michigan State University Kalamazoo Center for Medical Studies (MSU/KCMS) and Western Michigan University (WMU), the objectives of their error-in-medicine teaching sessions are for students to:

1) Understand the scope and gravity of error in health care settings

2) Gain familiarity with human perceptual limitations and cognitive biases, and learn that they are largely uncontrollable, yet very predictable

3) Know theoretical and practical reasons why "blame and train" and "bad apples" approaches fail

4) Understand the importance of discovering root causes in order to design proper countermeasures

5) Become familiar with human factors engineering and continuous quality improvement techniques that determine root causes and help design countermeasures

6) Understand major categories of error countermeasures, especially the role of computers

7) Understand limitations and pitfalls of automation as a countermeasure

8) Understand that some latent errors and systemic problems are exacerbated by poor design

The fourth-year medical students and residents began to receive error-in-medicine training in 1994 within a month-long medical informatics elective rotation at MSU/KCMS. The doctor of pharmacy students began taking their required month-long drug information rotation along with the medical students and residents in 1997. Groups of two to six students have reading assignments, homework, and two two-hour teaching sessions early in the month. The first session is an introduction to human factors engineering with an emphasis on error and user-centered software design. The second two-hour session is devoted entirely to error, and is described in full below. The entire month of the medical informatics curriculum has been described previously (ref. 14), and will soon be located on the Internet (www.kcms.msu.edu). Almost none of our students receive education regarding errors prior to this elective, except some brief exposure to quality improvement principles.

Since 1996, the nursing students have been taught about error-in-medicine as part of a course in "Nursing Informatics" at the School of Nursing, Western Michigan University (Kalamazoo, MI, USA). In summary, they receive three consecutive one-hour sessions, including: 1) human factors engineering and nursing computer systems and devices; 2) small group system evaluation exercises; and 3) human error and nursing. As with the month-long rotation, the first session is an introduction to human factors engineering with an emphasis on error and user-centered software design. The small group exercises support concepts in both the first and third hour. Most nursing students have had several months of clinical experience.

Early Evolution of the Curriculum

When the error-in-medicine teaching sessions began in 1994 for the medical residents and students, much of the curriculum was adapted from introductory human factors textbooks and was mostly didactic (ref. 24). The lecture started with a detailed presentation of three different types of error-in-medicine journal articles (refs. 1-3). The remainder of the lecture presented an overview of error theories, applicable cognitive psychology theories, taxonomies of errors, and a listing of countermeasures. Some of the material adapted from textbooks seemed to be based on the needs and mindsets of psychology or engineering students.

Early Feedback and Adult Learning Principles: Most of the early feedback from course surveys and informal interviews was positive. For example, one of the students said, "[It] gave me a better understanding of medical tools and analysis." However, some learners found the lecture material interesting but not relevant. Others did not ask many questions or interact during learning sessions. This experience, and the literature on adult learning principles (ref. 25), prompted a change toward more relevance, realism, simplicity, and fun. Some of this evolution is similar to problem-based learning changes that were being made at that time in the Michigan State University College of Human Medicine curriculum (East Lansing, MI, USA). Finally, lessons on format, readings, and exercises were borrowed from studies of the effectiveness of introductory human factors courses (ref. 26). With these considerations, experimentation with and improvement of course elements continued over many sessions, since the material is taught 10-12 times per year.

Curriculum Description

Self Study: Required chapters from Norman (ref. 27) and Casey (ref. 28) provide very readable introductions. Both books are very popular in introductory human factors courses and are highly recommended for any audience new to these principles. Both use everyday examples. Norman weaves them into the basic principles of user-centered design and human factors engineering. Casey tells several riveting short stories about all sorts of disasters, including a few medical ones, to implicitly illustrate the human cost of error-exacerbating system design.

Common Highlights of the Teaching Sessions: The session begins with a review of two Internet sites that contain interesting and obvious attempts to catalogue and mitigate errors in high risk situations: kayaking and SCUBA diving (refs. 29-30). Then, some students are asked to recount their everyday encounters with slips, mistakes, and errors. These are noted for discussion at appropriate times during the session. Next, data from Brennan (ref. 1), Gopher (ref. 2), and Ely (ref. 3) are used to prompt a sobering discussion on the scope of error in health care. The students then give their definitions of human error; these definitions are written out for the whole class to see, and the instructor highlights key words and phrases for comparison and contrast. To stimulate discussion, the instructor shows how some people can claim that error does not exist, and that "error" is simply normal human performance. To reinforce these ideas, a radar screen and vigilance graph are presented to show that error is part of normal performance. The vigilance graph is a simplified result from a laboratory study in which subjects watched a radar screen for "enemy ships"; detection performance decreases over several hours on watch. Following this, the graph is redrawn so the y-axis shows errors (missed ships), rather than per cent of ships detected, to demonstrate that error can be considered the "flip-side" of human performance.
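
The "flip-side" relationship can be made concrete with a small computation. The following sketch (in Python, with invented detection percentages that are for illustration only, not the actual laboratory data) converts a declining detection rate over a watch period into the corresponding miss rate, mirroring the redrawn graph:

    # Hypothetical vigilance data: per cent of "enemy ships" detected
    # during each hour on watch (numbers invented for illustration).
    hours = [1, 2, 3, 4, 5, 6]
    percent_detected = [95, 88, 82, 78, 75, 73]

    # The "flip-side" view: errors (missed ships) are simply the
    # complement of detection performance.
    percent_missed = [100 - d for d in percent_detected]

    for hour, detected, missed in zip(hours, percent_detected, percent_missed):
        print(f"Hour {hour}: {detected}% detected, {missed}% missed")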

The major portion of the teaching occurs when four to five students list and comment on actual experiences with error in health care. They are asked not to identify their role in the error, but they need to know enough detail to describe the circumstances leading to the event and the clinical and personal consequences. After each description is listed for all to see, the instructor helps the class determine the root cause(s) and possible countermeasures. During this analysis, he or she interjects several principles, including: findings from human error/performance theory and taxonomy; methods for root cause analysis; countermeasure types (e.g., warnings, interlocks); and the limitations of error theories and study methods.

An example would be: Incident description: clonidine 1.0 mg was ordered instead of 0.1 mg in the middle of the night; the patient fell down, and in the morning a pharmacist alerted the physician that the dose had been ten times too high. Clinical and personal consequences: the patient was nervous, with a few bruises but no lasting effects; the resident was embarrassed and reprimanded, but the rest of the "team" ignored it. Root causes: fatigue; inexperience; and missed steps and checks in getting and giving medications. Possible countermeasures: computerized order entry that checks for unusual doses; and requiring the resident to be standing when giving phone orders at night.
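
As an illustration of the first countermeasure, the sketch below shows a minimal dose-range check in Python. The drug table, dose limits, and function are hypothetical and are not drawn from any actual order-entry product; a real system would also account for route, weight, indication, and other context:

    # Hypothetical usual adult dose range for clonidine, in mg per dose
    # (illustration only; real systems use validated, context-aware limits).
    USUAL_DOSE_RANGE_MG = {"clonidine": (0.1, 0.3)}

    def check_order(drug, dose_mg):
        """Flag doses outside the usual range for confirmation."""
        low, high = USUAL_DOSE_RANGE_MG[drug]
        if dose_mg > high:
            return (f"WARNING: {drug} {dose_mg} mg exceeds the usual "
                    f"maximum of {high} mg; confirm before dispensing")
        if dose_mg < low:
            return (f"WARNING: {drug} {dose_mg} mg is below the usual "
                    f"minimum of {low} mg; confirm before dispensing")
        return "Order is within the usual range"

    # The incident dose: 1.0 mg entered instead of 0.1 mg (a tenfold error).
    print(check_order("clonidine", 1.0))

A check of this kind would have flagged the tenfold order at entry, before it reached the patient.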

In these courses, a major point of emphasis is the limitations and pitfalls of using computers as countermeasures (fixes) for errors. Many in health care have identified the role of computers in addressing errors, including medication errors (ref. 21), and as reminders for often-forgotten clinical interventions (ref. 31). Unfortunately, many computer systems solve the wrong problem or do not address the error in a usable manner. At best, the computer does nothing to eliminate the target errors. At worst, it introduces insidious new problems (ref. 32). A recent empirical study showed that only 4 out of 307 pharmacy computers correctly identified all unsafe medication orders in a field test (ref. 33). The researchers suggested that the root causes of the failures were complex programming, hard-to-use human-computer interfaces, and the unrealistic time commitments needed to properly maintain and use the systems. Most importantly, the students are given those rare medical informatics articles in which the investigators appropriately used human factors methodologies to develop information systems (ref. 34).

At the end of the session, all the lessons are summarized using the students' error stories and a "Swiss cheese" diagram adapted from Reason (ref. 35). All of the countermeasures at the level of the device (or software), the clinician, and the organization are represented as three slices of Swiss cheese (i.e., barriers), with holes of various sizes, arranged in a row. The "arrow" of an observed adverse event can only make it through if all the holes line up. The holes are redrawn smaller to demonstrate how improved device design, improved clinician knowledge and error awareness, and continuous improvement of the organization can all foil adverse events. In contrast, the holes for the device or software are redrawn much larger to show that many existing systems were not designed with human factors methods. This, in turn, shows how the "pressure" is on individuals and organizations to keep their "holes" small. In addition, it illustrates why they should be "pushing" manufacturers to use human factors engineering methods to create "Cheddar" instead of "Swiss" devices and software.
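
The intuition behind the diagram can also be expressed numerically. In the sketch below (Python, with invented failure probabilities, and barriers treated as independent purely for illustration), each barrier's "hole" is the chance that the barrier fails to stop an error; an adverse event reaches the patient only when every barrier fails, so shrinking any single hole reduces the overall risk multiplicatively:

    # Invented "hole sizes": probability that each barrier fails to stop
    # an error (illustration only; barriers assumed independent).
    barriers = {
        "device/software design": 0.20,
        "clinician knowledge and awareness": 0.05,
        "organizational checks": 0.05,
    }

    def adverse_event_probability(holes):
        """An event gets through only if all the holes line up."""
        p = 1.0
        for failure_rate in holes.values():
            p *= failure_rate
        return p

    print(f"Baseline risk: {adverse_event_probability(barriers):.5f}")

    # A human factors redesign shrinks the device/software hole tenfold.
    barriers["device/software design"] = 0.02
    print(f"After redesign: {adverse_event_probability(barriers):.5f}")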

Finally, a customized set of case studies and hand-out materials is provided, depending on class composition. For example, a recent newsletter from the Institute for Safe Medication Practices is provided to doctor of pharmacy students (ref. 5). During the nursing courses, stories encountered directly by the instructor are recounted, such as catastrophic errors with IV pumps used by home health nurses.

Another View of the Curriculum: Even with customization and progressive changes, there are several common elements to the teaching of error-in-medicine. A useful way to view the teaching activities is to organize them across four important considerations: real, relevant, simple (illustrative), and fun. Most activities or elements of the curriculum are organized in this manner in Table 1. As mentioned above, these considerations arose from experimentation with content and format (the basic session has been taught over 50 times).

Real
• The students tell their stories of clinical errors
• The students are given current case studies of medication and device mishaps from the ISMP newsletter
• Presentation of current case studies (medical journals or popular press)
• Students are usually involved in system evaluation projects where errors occur during testing

Relevant
• The students tell their stories of clinical errors
• There is purposeful emphasis on case studies that end in death or maiming
• Presentation of case studies from literature within their specialty
• Students are usually involved in system evaluation projects where errors occur during testing

Simple (illustrative)
• The students are asked for recent, everyday examples of error
• "Best-of" case studies that are not current or specialty-relevant (e.g., coffin-shaped pills not for internal use)
• Swiss cheese diagram adapted (simplified) from Reason
• Performance and error graphs from the radar example

Fun
• Small group exercises to evaluate errors with common items
• Humor by instructor and class
• Review of Internet databases on kayak and SCUBA diving deaths and injuries

Table 1 - Elements of Curriculum Organized Across Four Important Parameters

 

Additional Exercises and Teaching Sessions: Many topics and skills that are only briefly covered in the error teaching sessions are reinforced elsewhere. Decision making errors and the use of clinical decision support systems are covered in later sessions (ref. 36). Projects assigned for the month or semester are mainly evaluations of information technology or devices using human factors methods. These can range from field observation to usability testing with low-fidelity prototypes. The students are also assigned to ongoing projects as observers or data collectors. In some cases, the students are given self-study lab exercises, such as a short human factors evaluation of an IV pump prototype. In the nursing classes, all of the students do a brief, yet real, evaluation of a familiar system (e.g., a Legos(TM) toy, calculator, or pocket-sized lens cleaning kit). Lastly, the pharmacy students are specifically required to learn about processes to analyze and report medication adverse events (ref. 23), and occasionally medical students and residents learn with them.


Ongoing Evaluation of the Curriculum

Medical Student and Resident Comments: Given the small size and frequency of the teaching sessions, constructive comments are the most common and useful feedback mechanism. During class, many of the students comment that this was the first time they understood why they might be involved in quality improvement efforts at the hospital or clinic. Some of the students and residents "breathe a sigh of relief" when they see the theory behind the mistakes they make, how some of the errors are exacerbated by poor design, and the potential for properly designed information systems to help them. In surveys at the end of the rotation, some have noted that the error or human factors teachings were the most novel and surprisingly useful. Unsolicited, a few have even said that the information should be a required part of early medical school courses, and that they were "mad" that it took so long to hear about the topic. It should be noted that formal surveys after each teaching session have recently been implemented, and the data so far are very positive.

Follow-up Interviews with Medical and Pharmacy Students: In-depth semi-structured phone interviews were conducted with a convenience sample of six residents, medical students, and pharmacy students who had completed the month-long rotation. The six had taken the rotation six, eight, nine, ten, twelve, and twenty-two months before the interview. Two physicians and two pharmacists were now full-fledged practitioners; two were now resident physicians. All remembered the error-in-medicine lecture and class exercises, but not to any level of detail. All but one of them said that the change in attitude about system design had persisted. The most vivid memory for four of them was their new understanding of the lack of user-centered design, and how this deficiency led to hard-to-use systems and errors. Some of the more illustrative quotes from the interviewees were: "it helps you attack the problem [of error], instead of avoid it"; "I think I was very impacted by your course"; "I remember thinking it [the course] was very helpful...stuff that was thought to be common sense does need study"; and "computers and technology needs to be usable and user-friendly or [it will be] rejected". The participants' permission to use these quotes is on file.

Nursing Informatics Students: A semi-structured questionnaire was completed by the four nursing informatics classes that received human factors and human error teaching sessions in 1998 and 1999 (N=67). The intent of the questionnaire was not research but continuous improvement of the sessions. Tables 2 and 3 below summarize the number and per cent of responses to the two questions: "What was the most useful portion of the class tonight?" and "What was the least useful portion of the class tonight?"

Response: Frequency (Per Cent of Total)

• Class exercises were a good learning experience, allowed clarification of the "human error concept", or provided insight into the human factors process: 36 (54%)
• Case studies helped to analyze why systems are not working in health care, gave insight into problems that occur, or showed that systems often fail: 24 (36%)
• The first lecture helped apply human factors to nursing practice: 5 (7%)
• Slide handouts aided understanding or helped students follow the slides: 2 (3%)

Total comments: 67 (100%)

Table 2 - Most Useful Portions of the Class


Response: Frequency (Per Cent of Total)

• Nothing was "least useful"; all portions of the class were useful: 26 (53%)
• Couldn't hear or see the video, or didn't understand what it was about and needed a better example of observer evaluation: 5 (11%)
• History portion in the first lecture: 4 (9%)
• Class exercises were geared more toward engineers than health professionals; didn't need 30 minutes to evaluate the [Legos(TM)] toy: 2 (4%)
• Not used to having multiple breaks in one class: 2 (4%)
• Sometimes difficult to follow the outline; repetitive: 2 (4%)
• Inadequate explanation of human factors; the concept was difficult to understand: 2 (4%)
• Miscellaneous (one each): improve exercises, "dry" notes, discussion of examples, too long, human factors information was unnecessary: 5 (11%)

Total comments: 48 (100%)

Table 3 - Least Useful Portions of the Class

It is clear from Table 2 that the class exercises and case studies were the most preferred. The responses in Table 3 are too infrequent and scattered to support much of a claim. It is of note that 54 of 67 (80%) nursing students either filled in nothing or explicitly stated that there was no "least useful" part of the training.

Anecdotally, other nursing faculty have told the nursing informatics instructor that they can tell when students have taken the nursing informatics course because the students will use phrases like, "that's a human factors problem." Students were also heard to say that the course helped them recognize that they, themselves, are not the problem when they are slow to learn computers. Instead, they recognize that the problem is very likely poor design (ref. 37).

Major Issues

The toughest issue for most new curriculum ideas in health care education is how to fit them into an already full curriculum. As described above, error-in-medicine teaching can be added where it best fits: a fourth-year medical school elective, a residency elective, or a required course. Without better learning outcomes data, it is hard to claim this approach is sufficient. People who champion other new medical curriculum ideas, like computer training and evidence-based medicine, are trying to include new courses, or alter required ones, early in training. It would make the most practical sense to offer error-in-medicine training before students begin working with patients in hospitals and clinics. Given the number of hours devoted to such training in other domains, the curriculum should probably be longer than three or four hours.

Another recently highlighted criticism, by Casarett and Helms, is the possibility that the systems approach to error may teach irresponsibility or laziness (ref. 38). The authors suggest that if residents are able to put the blame on "systems", they may not work as hard to remedy their own lack of knowledge. First, this criticism assumes that the main reason for systems-oriented error-in-medicine training is to encourage and improve participation in quality improvement activities. In the curriculum above, supporting continuous quality improvement activities is only a small part of the overall aims. Second, the authors describe a limited root cause analysis, one that addresses only organizational and other flaws outside the control of the resident. Human error is rarely a matter of either the system or the person; it is almost always a combination of root causes that must be carefully analyzed, with deficiencies appropriately fixed. In the error-in-medicine teaching at MSU/KCMS, the aim is to teach residents to see when errors are due to inadequate skills and knowledge versus when they are due to inherent cognitive limitations and biases. Finally, it is not the intent to have error-in-medicine training and causal analysis stop at self- or organizational improvement. The learners should gain improved consumer skills to "push" industry to adopt human factors engineering methods during design (ref. 39). This, more than any regulation or journal article, will lead to less error-prone medical software and devices.

Practical Lessons Learned

There are several things one should know in preparing for and teaching error-in-medicine to almost any health care student or practitioner. You need to change the learner's perspective and preconceptions first, or many will not listen to the remaining lessons. This is usually not too difficult because the existing perspective is not deeply ingrained. As described above, the case studies, examples, and references have to be as relevant as possible to the type of health care student and their specialty (e.g., pediatric ICU medication errors for a pediatric resident). Be prepared for learners who are resistant or arrogant. Focusing on how improving the system helps their "less fortunate" colleagues is one method. With more time, you can carefully have them use error-prone devices and software, and then show them how a human factors redesign diminishes such errors. Be prepared for learners who get stuck worrying about litigation, and how "nothing", not even human factors engineering redesign, will help stop the lawyers. You can address this helplessness by noting that there is little overlap between the subsets of malpractice suits and preventable adverse events (refs. 1, 4). You can also state that systems thinking and human factors engineering analysis are part of their duty as practitioners and keys to offering the very best care. Finally, you will be tempted to overwhelm students with too many references, complex theories, and tangential examples. Stick with a few examples and theoretical points that you know very, very well. Unless the students are getting an advanced degree in this area, the textbooks on error or human factors engineering are too much.

If you are training the educators who will train students, or are preparing to teach error-in-medicine yourself, please consider these recommendations. The educator will get negative reactions from some learners, so novice teachers should prepare themselves with on-hand references (see above). It is easy to get confused about the major types of errors (slips and mistakes) and countermeasures (e.g., warnings, training, interlocks). The teachers must hold a user-centered, human factors engineering philosophy themselves, or the lessons will not ring true. Teachers will have to recall their own "attitude change" to foster one in the students. The teacher must be able to deal with ethical and legal issues (refs. 18-19). Specifically, a student might ask how to report an error or work through a root cause analysis in their hospital or clinic; the contact person or standard procedures will need to be readily available. The teacher should practice coaching students on how to approach superiors about incidents or near misses. Finally, plan for a few learners who will have significant emotional reactions to their memories of incidents or to the discussion of the material in general.

Conclusions and Next Steps

Error-in-medicine training for doctors-, nurses-, and pharmacists-in-training is easy, fun, effective, and desperately needed. The author offers his support to those who want to try it; in turn, some feedback would be appreciated. Some health care organizations might not realize they have people with human factors engineering or error-in-medicine backgrounds in their midst (e.g., in anesthesia, pharmacy, or radiology departments). Others might try asking faculty or graduate students from their local (friendly) university department (see refs. 15, 40-41). It is hard to imagine how all the ambitious quality and safety programs around the world will succeed unless health care personnel have a constructive attitude toward and knowledge about error-in-medicine.

 

References

1. Brennan, T.A., L.L. Leape, N.M. Laird, et al., 1991. Incidence of Adverse Events and Negligence in Hospitalized Patients: Results of the Harvard Medical Practice Study I. New England Journal of Medicine 324: 370-6.

2. Gopher, D., M. Olin, Y. Badhi, et al., 1989. The Nature and Causes of Human Errors in a Medical Intensive Care Unit. In Proceedings of the Human Factors and Ergonomics Society 33rd Annual Meeting. Santa Monica, CA: Human Factors and Ergonomics Society.

3. Ely, J.W., W. Levinson, N.C. Elder, and A.G. Mainous, 1995. Perceived Causes of Family Physicians' Errors. Journal of Family Practice 40(4): 337-44.

4. Bogner, M.S. 1994. Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

5. Institute for Safe Medication Practices. 1999. [online]. Warminster, PA. Available from World Wide Web: www.ismp.org.

6. Anesthesia Patient Safety Foundation. 1999. [online]. Pittsburgh, PA. Available from World Wide Web: http://apsf.med.yale.edu.

7. National Patient Safety Foundation. 1999. [online]. Chicago, IL. Available from World Wide Web: www.npsf.org.

8. Leape, L.L., 1994. Error in Medicine. Journal of the American Medical Association 272: 1851-57.

9. Vincent, C., S. Taylor-Adams, and N. Stanhope, 1998. Framework for Analyzing Risk and Safety in Clinical Medicine. British Medical Journal 316: 1154-1157.

10. Berwick, D.M., 1996. A Primer on Leading the Improvement of Systems. British Medical Journal 312: 619-622.


11. Food and Drug Administration Safe Medical Devices Act of 1990, 1995. Medical Device User Facility and Manufacturing Reporting: Certification and Registration (21CFR Part 803). Federal Register 60(237): 63578-63607.

12. Burlington, D.B., 1996. Human Factors and the FDA’s Goals: Improved Medical Device Design. Biomedical Instrumentation Technology 30(2): 107-109.

13. Lin, L., R. Isla, K. Doniz, H. Harkness, K.J. Vicente, and D.J. Doyle, 1998. Applying Human Factors to the Design of Medical Equipment: Patient-controlled Analgesia. Journal of Clinical Monitoring and Computing 14: 253-263.

14. Stahlhut, R.W., J.W. Gosbee, and D.J. Gardner-Bonneau, 1997. A Human-centered Approach to Medical Informatics for Medical Students, Residents, and Practicing Physicians. Academic Medicine 72(10): 881-887.

15. Human Factors and Ergonomics Society. 1999. [online]. Santa Monica, CA. Available from World Wide Web: www.hfes.org.

16. Gibb, R.W., 1998. Undergraduate Human Factors Curriculum and Introductory Course Content. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting. Santa Monica, CA: Human Factors and Ergonomics Society.

17. Meurier, C.E., C.A. Vincent, and D.G. Parmar, 1997. Learning from Errors in Nursing Practice. Journal of Advanced Nursing 26(1): 111-119.

18. Wu, A.W., S. Folkman, S.J. McPhee, and B. Lo, 1991. Do House Officers Learn From Their Mistakes? Journal of the American Medical Association 265: 2089-2094.

19. Wu, A.W., S. Folkman, S.J. McPhee, and B. Lo, 1993. How House Officers Cope With Their Mistakes. Western Journal of Medicine 159: 565-569.

20. Healthcare Information Management Systems Society. 1996. Guide to Nursing Informatics. Chicago, IL: Healthcare Information Management Systems Society.

21. American Society of Health-System Pharmacists, 1996. Top-priority Actions for Preventing Adverse Drug Events in Hospitals: Recommendations From an Expert Panel. American Journal of Health-System Pharmacists 53: 747-51.

22. Joint Commission on Accreditation of Healthcare Organizations. 1999. [online]. Oakbrook Terrace, IL. Available from World Wide Web: www.jcaho.org.

23. Food and Drug Administration, 1995. Medical Devices: Medical Device User Facility and Manufacturing Reporting: Certification and Registration (21CFR Part 803). Federal Register 60(237): 63578-63607.

24. Sanders, M.S., and E.J. McCormick. 1993. Human Factors in Engineering and Design. 7th ed. New York: McGraw-Hill, Inc.

25. Nowlen, P.M. 1988. A New Approach to Continuing Education for Business and Professions. New York: Macmillan.

26. Sojourner, R.J., A.J. Aretz, and K.M. Vance, 1993. Teaching an Introductory Course in Human Factors Engineering: A Successful Learning Experience. In Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting. Santa Monica, CA: Human Factors and Ergonomics Society.

27. Norman, D.A. 1988. "To Err Is Human." In The Design of Everyday Things. New York: Basic Books, Inc.

28. Casey, S. 1993. Set Phasers On Stun and Other True Tales of Design, Technology, and Human Error. Santa Barbara, CA: Aegean Publishing.

29. American Whitewater Safety Database. 1999. [database online] Lakewood, CO. Available from World Wide Web: www.awa.org/awa/safety/index.html.

30. Divers Alert Network (DAN). 1999. [online] Durham, NC. Available from World Wide Web: http://dan.ycg.org.

31. McDonald, C.J., 1976. Protocol-based Computer Reminders, The Quality of Care and the Non-perfectibility of Man. New England Journal of Medicine 295: 1351-5.

32. Louie, C. and A. Luber. 1996. Automated Drug Delivery Systems and the Potential for Medication Misadventures: A University’s Experience. New York: American Association for the Advancement of Science.

33. Cohen, M. 1999. Over-reliance on Pharmacy Computer Systems May Place Patients at Great Risk. ISMP Medication Safety Alert! 4(3): 1.

34. Thull, B., U. Janssens, G. Rau, and P. Hanrath, 1997. Approach to Computer-based Medication Planning and Coordination Support in Intensive Care Units. Technology in Health Care 5(3): 219-233.

35. Reason, J.T. 1990. Human Error. Cambridge, England: Cambridge University Press.

36. Riegelman, R.K. 1991. Minimizing Medical Mistakes: The Art of Medical Decision Making. Boston, MA: Little, Brown and Co.

37. "Personal Communication with Kathy Young." 1999. [electronic mail]. [cited February 1999].

38. Casarett, D., and C. Helms, 1999. Systems Errors Versus Physicians' Errors: Finding the Balance in Medical Education. Academic Medicine 74: 19-22.

39. Gosbee, J., 1998. Communication Among Health Professionals: Human Factors Engineering Can Help Make Sense of the Chaos. British Medical Journal 316: 642.

40. International Ergonomics Association. 1999. [online]. Available from the World Wide Web: www.iea.org.

41. The Ergonomics Society. 1999. [online]. Available from the World Wide Web: www.ergonomics.org.uk.

 

Biography

Dr. John Gosbee, Director, Center for Applied Medical Informatics at Michigan State University Kalamazoo Center for Medical Studies, 1000 Oakland Dr., Kalamazoo, Michigan, 49008, USA. Telephone: (616) 3370-4435. Email: gosbee@kcms.msu.edu. Web site: www.kcms.msu.edu/cami/camihome.html

John Gosbee, MD, MS has been involved in education, research, and development aspects of error in medicine and patient safety for eleven years. As an expert in human factors engineering and health care, he has provided leadership to interdisciplinary professional societies in these areas (e.g., Human Factors and Ergonomics Society), as well as consultation to government agencies (e.g., U. S. Food and Drug Administration), trade organizations (e.g., Canon Communications) and medical technology companies (e.g., Baxter Healthcare).