Abstract
The Germanwings Flight 9525 crash occurred on March 24, 2015, revealing how critical human error can be to aviation safety. Through analysis of scholarly literature on the incident, this research paper investigates the human-based failures that led to the event. The theoretical frameworks used to understand the human factors implications are the Human Factors Analysis and Classification System and the Swiss Cheese Model. The research method was to gather qualitative data from peer-reviewed articles on the event and to analyze official reports from aviation investigative agencies. Previous research showed the co-pilot was mentally unwell at two points in his career; at both points, human intervention had the potential to prevent the eventual crash. In 2008, the co-pilot was diagnosed with depression but was returned to flying status with a waiver. After that waiver, no further psychiatric wellness checks were performed. The co-pilot did, however, see a private physician weeks before the crash, where he was again diagnosed with depression, and this time medication was prescribed. This information was not shared with aviation authorities, and the co-pilot was allowed to remain on flying status.
Keywords: aviation safety, human factors analysis, pilot mental health, systematic safety failures
Human Factors in Germanwings Flight 9525
This research paper analyzes human-based failures that were instrumental in the 2015 Germanwings Flight 9525 crash. Delving into the situation and evaluating human causes provides context for the theoretical frameworks being presented: the Swiss Cheese Model and the Human Factors Analysis and Classification System (HFACS).
On March 24, 2015, a co-pilot who suffered from mental health issues took control of the cockpit after the captain left momentarily. Shortly after, the co-pilot intentionally crashed the plane, killing all 150 people on board. In this incident, the co-pilot’s depressive symptoms and suicidal or morbid ideations led to the crash. Wu et al. (2016) note that “previous suicide attempts and having a history of mental disorders, particularly clinical depression, are risk factors of suicide” (p. 2). The prevalence of these risk factors among pilots is shown in the same research by Wu et al. (2016):
1,837 (52.7%) of the 3485 surveyed pilots completed the survey, with 1866 (53.5%) completing at least half of the survey. 233 (12.6%) of 1848 airline pilots responding to the Patient Health Questionnaire 9 (PHQ-9), and 193 (13.5%) of 1430 pilots who reported working as an airline pilot in the last seven days at the time of the survey, met the depression threshold–PHQ-9 total score ≥ 10. Seventy-five participants (4.1%) reported having suicidal thoughts within the past two weeks. We found a significant trend in proportions of depression at higher levels of use of sleep-aid medication (trend test z = 6.74, p < 0.001) and among those experiencing sexual harassment (z = 3.18, p = 0.001) or verbal harassment (z = 6.13, p < 0.001).
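The proportions quoted above are internally consistent, which can be checked with simple arithmetic. A minimal sketch in Python (the denominator for the suicidal-thoughts figure is not stated explicitly in the quote and is assumed here to be the 1,848 responding airline pilots):

```python
# Sanity-check of the proportions quoted from Wu et al. (2016).
# Counts are taken directly from the quoted passage.
def pct(numerator, denominator):
    """Return a percentage rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

assert pct(1837, 3485) == 52.7   # completed the survey
assert pct(1866, 3485) == 53.5   # completed at least half of the survey
assert pct(233, 1848) == 12.6    # airline pilots meeting PHQ-9 >= 10
assert pct(193, 1430) == 13.5    # active airline pilots meeting the threshold
assert pct(75, 1848) == 4.1      # suicidal thoughts (denominator assumed)
```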
Theoretical Framework
The Swiss Cheese Model is a theory used to understand how accidents occur within a complex system, which often has fail-safes for preventing accidents. It was created by Reason on the premise of the theory of active and latent failures, which explains that accidents occur when small mistakes line up perfectly to culminate in an accident. The Swiss Cheese Model is a visual depiction of this theory, illustrated as multiple slices of Swiss cheese stacked together. Each slice of cheese is a portion of the complex system, and each hole in the cheese represents a small mistake or failure within the system. It is unlikely, but possible, that a hole will pass through all the combined slices of cheese. Similarly, it is unlikely, but possible, that mistakes or problems will persist throughout a complex system without being fixed or stopped (Wiegmann et al., 2022).
In his book Human Error, Reason (1991) describes four levels within a complex system that can fail. “These levels can best be described as Unsafe Acts, Preconditions for Unsafe Acts, Supervisory Factors, and Organizational Influences” (Wiegmann et al., 2022, p. 119). Each level is a dynamic condition that must fail for an accident to continue to the next. Additionally, each hole or failure can be categorized as active or latent. An active failure is one that can be stopped by immediately addressing the issue, while a latent failure is a systemic weakness that requires organizational influences and supervisory factors to change or fix. Active failures often require a reactive response, while latent failures need to be addressed proactively (Wiegmann et al., 2022).
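The model’s central intuition, that an accident requires a hole in every defensive layer simultaneously, can be illustrated with a toy calculation. The layer names below follow Reason’s four levels; the per-layer failure probabilities and the independence assumption are purely hypothetical and chosen for illustration:

```python
# Toy illustration of the Swiss Cheese Model: an accident occurs only
# when a "hole" (failure) is present in every defensive layer at once.
# Layer names follow Reason's four levels; the per-layer failure
# probabilities are hypothetical values chosen for illustration.
layers = {
    "Organizational Influences": 0.10,
    "Unsafe Supervision": 0.05,
    "Preconditions for Unsafe Acts": 0.02,
    "Unsafe Acts": 0.01,
}

# Assuming independent layers, the chance all holes align is the product.
p_accident = 1.0
for p_failure in layers.values():
    p_accident *= p_failure

print(f"Probability all layers fail together: {p_accident:.0e}")  # 1e-06
```

Even with individually modest failure rates, the combined probability is tiny, which is why accidents like this one require an unusual alignment of failures across every layer.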
Various safety and root cause analysis frameworks work alongside the Swiss Cheese Model; the most notable is the HFACS framework (Fig. 1). HFACS was developed by Shappell and Wiegmann to enhance Reason’s model by showing the hierarchy of relationships between factors (Wu et al., 2023).
Figure 1
The HFACS Framework
Note. The HFACS framework depicts four levels of failure with causal categories of active and latent failures that occur (Skybrary, n.d.).
The HFACS framework creates a hierarchy within the levels of human error. Unsafe acts, at the bottom, consist of errors or violations. Preconditions for unsafe acts set the conditions that allow or cause the unsafe acts to occur. Unsafe supervision is the lack of proper oversight that would otherwise catch an error or correct a failure. Finally, organizational influences sit at the highest level, creating the cultures, policies, or procedures within an organization that enable accidents to happen (Wu et al., 2023).
Literature Review
The final report from the Bureau d’Enquetes et d’Analyses (BEA, 2016) dissected the errors that led up to the crash of Germanwings Flight 9525. The co-pilot had experienced a depressive episode and sought medical treatment. During his 2008 yearly medical renewal, he was given an endorsement with a waiver for special conditions and restrictions, resulting in a class 1 medical certificate. From 2009 to 2014, the co-pilot’s medical certificate was regularly renewed by aeromedical examiners who were aware of his depressive episode but never required further psychiatric assessment. Between his first waiver and the crash, there was no officially mandated aeromedical involvement of a psychiatrist or psychologist with the co-pilot. However, a private physician referred the co-pilot to a specialist for possible psychosis two weeks before the crash, and the co-pilot was prescribed antidepressant and sleep-aid medication, which he was actively taking at the time of the crash. The private physician’s diagnosis was not communicated beyond the patient, so no aviation authority was advised of his medical condition (BEA, 2016).
The aircraft was flown from Düsseldorf to Barcelona on the morning of the crash. During that flight, the co-pilot was left alone in the cockpit and simulated a descent to 100 ft. During the accident flight, the captain left the cockpit while the aircraft was at 38,000 ft. While alone, the co-pilot changed the altitude setting to 100 ft, starting a descent. Air traffic and air defense authorities noticed the abnormality and tried to reach the aircraft without receiving a response. The captain and crew, locked out of the cockpit, tried to get inside and elicit an answer from the co-pilot during the descent. The aircraft continued descending until it collided with terrain (BEA, 2016).
According to the BEA (2016), the co-pilot intentionally crashed the plane due to mental health issues. The medical certification process for pilots failed to prevent this because pilots fear the loss of their license and its financial consequences, and because guidelines for reporting were unclear. Additionally, cockpit security measures made it impossible to intervene before the crash.
Methodology
The research methodology analyzes human factors in the complex system of aviation to evaluate the Germanwings Flight 9525 crash. The case studied has a comprehensive accident report that involves multiple layers of human factor errors leading to the accident.
The theoretical frameworks that the case is evaluated against are the HFACS framework and the hierarchy of human error involved. Also, the Swiss Cheese Model is used as an addition to visually represent the scenario. The qualitative data given by literature is analyzed to give detailed reports of the incident and to identify human factors.
A limitation of the HFACS framework, the Swiss Cheese Model, and human error hierarchies is that details are reviewed in retrospect. Projecting outcomes by changing factors that led to the accident creates hypothetical solutions that cannot be guaranteed.
Implications
The research within this paper is meant to shed light on human factors that can cause devastating incidents. Aviators and safety specialists can take this information and help prevent further incidents. Individuals and organizations with authority can change safety protocols, take proactive measures, and implement safety audits to avoid repeating these events. However, such research can also have negative implications for stakeholders: it highlights failures within the system that reflect poorly on companies, individuals, and states. Nevertheless, these failures need to be addressed.
Implications of the crash itself have materialized since 2015. One of which is a Pilot Mental Health Working Group from the Aerospace Medical Association that aims to “improve awareness and identification of pilot mental health issues during the aeromedical assessment of pilots” (Anzalone et al., 2016, p. 506). The results of this group are “successful approaches that improve rates of reporting, discussion, and participation” (p. 506) with Project Wingman, Air Line Pilots Association Human Intervention Motivation Study, and Delta Airlines Pilot Assistance Network (Anzalone et al., 2016).
Results
The following results use the HFACS framework to show failures within the system, specifically adhering to its four components: Unsafe Acts, Preconditions for Unsafe Acts, Unsafe Supervision, and Organizational Influences.
First, the Unsafe Act was the co-pilot’s deliberate descent of the aircraft into terrain. Next, the Preconditions for the Unsafe Act were the co-pilot’s mental health and the captain’s inability to access the locked cockpit. The co-pilot’s mental unwellness was diagnosed multiple times throughout his career, and medication was prescribed just before the crash. That this was known and not communicated to authorities was a significant factor enabling the unsafe act. Additionally, the captain’s inability to access the locked cockpit was a precondition that allowed the unsafe act to happen.
Unsafe Supervision was the next layer of the system that failed. The co-pilot’s mental health issues were not adequately monitored after his depressive episode and medical waiver in 2008. That he was never re-examined by an aeromedical examiner for mental health was a failure in the supervisory portion of the system.
Finally, the Organizational Influence affecting this crash was the airline’s policy on pilot health. The organization’s lack of proactive health checks was a failure that ultimately contributed to the crash. All of these human factor failures happened without intervention, lining up the holes in the system and causing the crash.
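The layered failures identified in this section can be summarized in a simple mapping, paraphrasing the findings above (an illustrative structure, not part of the BEA report):

```python
# Summary mapping of the HFACS failures identified for Germanwings 9525,
# paraphrased from the analysis in this section (illustrative only).
hfacs_findings = {
    "Unsafe Acts": [
        "Co-pilot deliberately initiated and sustained the descent",
    ],
    "Preconditions for Unsafe Acts": [
        "Co-pilot's undisclosed mental health condition",
        "Captain unable to access the locked cockpit",
    ],
    "Unsafe Supervision": [
        "No psychiatric re-examination after the 2008 depressive episode",
    ],
    "Organizational Influences": [
        "Airline policy lacking proactive pilot health checks",
    ],
}

# Every HFACS level recorded at least one failure: the holes aligned.
assert all(hfacs_findings.values())
print(f"Levels with failures: {len(hfacs_findings)} of 4")
```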
The Germanwings Flight 9525 crash exposed issues that need to be addressed. Organizations need to improve mental health screening and support for members of the aviation industry. They also need to implement policies and practices for health reporting and re-evaluation. Wu et al. (2016) present the prevalence of mental health issues in the aviation community, showing this is not an isolated case.
Future Research
Future research must analyze incidents beyond the Germanwings Flight 9525 crash. Expanding the research will reveal trends in human factor errors. It may also show the impact of increased mental health support if analysis is done after new procedures are implemented. The identified human errors should also be investigated further by other disciplines, such as psychology, safety engineering, and human factors.
The current areas of interest for future research and safety recommendations in this case from the BEA (2016) are: “medical evaluation of pilots with mental health issues”; “routine analysis of in-flight incapacitation”; “mitigation of the consequences of loss of license”; “anti-depressant medication and flying status”; “balance between medical confidentiality and public safety”; and “promotion of pilot support programs” (pp. 98-103).
Conclusion
Germanwings Flight 9525 was a terrible incident that required an extensive investigation, revealing the complex nature of the aviation industry and how human factors affect safety within it. Analyzing the crash through the HFACS framework and the Swiss Cheese Model shows how human factor failures, without intervention, can compound and lead to accidents.
The findings revealed that the co-pilot’s unsafe acts caused the accident. His mental health and the captain’s inability to access the locked cockpit were preconditions that allowed the unsafe acts to happen. Furthermore, inadequate monitoring of the co-pilot’s mental health and the airline’s policy on pilot health were the unsafe supervision and organizational failures that led to the crash.
This research has significant implications for the aviation community and stakeholders. The aim is to show that there is a problem to be solved and to identify human factors as a main concern. Future research should take a closer look at these failures by expanding to other incidents involving mental health oversights and human error.
References
Anzalone F., Belland K., Bettes T., Beven G., Bor R., Damann V., Dillinger T., Dowdall N., Evans A., Evans S., Flynn C., Fonne V., Front C., Gonzalez C., Hastings J., Herbert K., Chun C., Hudson M., King R., Lange M., … (2016). Pilot mental health: Expert working group recommendations - revised 2015. Aerospace Medicine and Human Performance, 87(5), 505-507. https://doi.org/10.3357/AMHP.4568.2016
Bureau d’Enquetes et d’Analyses (BEA). (2016). Final report: Accident on 24 March 2015 at Prads-Haute-Bléone (Alpes-de-Haute-Provence, France) to the Airbus A320-211 registered D-AIPX operated by Germanwings. https://bea.aero/uploads/tx_elydbrapports/BEA2015-0125.en-LR.pdf
Georgiadou, N., & Kakarelidis, G. (2020). Greece flight 9525 crash: Balance between the rights of medical confidentiality and the general right of public safety under Greek law. European Data Protection Law Review (Internet), 6(4), 560-566. https://doi.org/10.21552/edpl/2020/4/12
Reason, J. (1991). Human error. Cambridge University Press.
Skybrary. (n.d.). The HFACS Framework [diagram]. Skybrary. https://skybrary.aero/articles/human-factors-analysis-and-classification-system-hfacs
Wiegmann, D. A., Wood, L. J., Cohen, T. N., & Shappell, S. A. (2022). Understanding the "swiss cheese model" and its application to patient safety. Journal of Patient Safety, 18(2), 119-123. https://doi.org/10.1097/PTS.0000000000000810
Wu, A. C., Donnelly-McLay, D., Weisskopf, M. G., McNeely, E., Betancourt, T. S., & Allen, J. G. (2016). Airplane pilot mental health and suicidal thoughts: A cross-sectional descriptive study via anonymous web-based survey. Environmental Health, 15(1), Article 121. https://doi.org/10.1186/s12940-016-0200-6
Wu, Y., Zhang, S., Zhang, X., Lu, Y., & Xiong, Z. (2023). Analysis on coupling dynamic effect of human errors in aviation safety. Accident Analysis and Prevention, 192, Article 107277. https://doi.org/10.1016/j.aap.2023.107277