Two fully occupied jumbo jets are on the runway in dense fog—one taxiing toward the terminal, the other readying for takeoff. In the cockpit, the first officer echoes the latest radio message, saying, “We are now at takeoff,” although he could have said, unambiguously, “We are taking off.” The captain, impatient, interrupts: “We're going,” releases the brakes, and accelerates the plane down the runway. Twenty seconds later, the flight engineer asks, “So, he hasn't left there yet?” The captain replies, “What do you say?” and the engineer repeats, “Hasn't he left there yet, this Pan American?” The captain, brusque, responds, “Of course.” Fourteen seconds later, the cockpit voice recorder captures the devastating sounds of impact. This collision claimed 583 lives and remains the deadliest accident in civil aviation history—the Tenerife disaster.
The Tenerife tragedy, like many aviation catastrophes, revealed critical issues rooted in strict hierarchies, flawed decision-making, and miscommunication. These so-called “soft skills”—team leadership, decision-making, and situational awareness—were profoundly lacking, leading to misinterpretations and fatal mistakes. Other tragic examples, such as the 1978 crash of United Airlines Flight 173 or the 2009 crash of Air France Flight 447, show similar patterns where the flight crews were unaware of urgent problems, with fatal consequences. In each of these cases, communication breakdowns, misjudgments, and a failure to heed warning signs were underlying causes. Such failures extend beyond aviation to space exploration. While the 1986 U.S. space shuttle Challenger disaster ostensibly resulted from defective O-rings, a deeper investigation revealed critical issues in organizational communication, safety standard deviations, and a failure to escalate engineering concerns.[1] National Aeronautics and Space Administration (NASA) managers dismissed engineers' repeated warnings, with one senior engineer predicting the previous year that “total element failure…could cause loss of life or vehicle.”[2] During discussions the night before the final launch, engineers were unable to clearly convey their concerns and were ultimately disregarded by NASA leadership.[1]
Human error and in-flight loss of control (LOC) have historically been leading causes of fatal accidents, according to various commercial aircraft accident statistics.[3] [4] [5] In response to a series of catastrophic accidents attributed to human error, NASA held its first workshop in 1979, titled “Resource Management on the Flight Deck,” which underscored the critical connection between aviation accidents and failures in interpersonal communication, decision-making, and leadership within flight crews.[6] This workshop laid the foundation for modern Crew Resource Management (CRM), a strategy designed to minimize human error and enhance teamwork and communication in complex, high-risk environments. Since then, CRM has become a standard in both civil and military aviation, though its development and implementation have varied by country and regulatory authority. In the United States, CRM is regulated by the Federal Aviation Administration (FAA), while global standards are set by international organizations such as the International Civil Aviation Organization (ICAO).[7] [8] Today, CRM training is an integral part of preparation for flight crew members and other safety-critical personnel, with guidelines such as FAA Advisory Circulars prescribing its implementation. The adoption of CRM has contributed significantly to reducing accidents rooted in human error, though its precise impact on lives saved or aircraft preserved may never be fully quantified.[4] [8] [9] [10]
Authors' Contributions
M.A.K. and N.D. had the research idea. M.A.K., C.v.S., and N.D. contributed to study conception and design. Data collection and literature review were performed by M.A.K., N.D., C.v.S., F.M., and C.S. M.A.K. wrote the first draft. M.A.K. and N.D. reviewed and edited the draft to the final manuscript. N.D. supervised the work. All authors read and approved the final manuscript.
AI-assisted technology was not used for the generation, evaluation, or interpretation of the data presented in the manuscript, nor for the creation of text, figures, or tables. AI-based tools (ChatGPT) may have been used to improve language and text readability.
Publication History
Received: 22 March 2025
Accepted: 16 April 2025
Article published online: 26 June 2025
© 2025. Thieme. All rights reserved.
Georg Thieme Verlag KG
Oswald-Hesse-Straße 50, 70469 Stuttgart, Germany