Human Factors in Aviation

Using HFACS and the Swiss Cheese model, this study analyzes Germanwings 9525 and Shaw AFB F-16CM crashes, showing that beyond pilot error, latent supervisory and organizational failures played key roles. It urges expanding human-factor frameworks to encompass AI and new aviation tech.

By Mathew Lewallen

Abstract

This paper applies human factors theoretical frameworks to aviation safety, focusing on two significant accidents: the Germanwings Flight 9525 crash in 2015 and the Shaw Air Force Base F-16CM crash in 2020. The Human Factors Analysis and Classification System (HFACS) is used to analyze the human-factor-based causes of the accidents, showcasing the complexity and dynamics of the human elements involved. Additionally, the Swiss Cheese model is used to visually represent the factors contributing to accidents within a complex system. Examination of the official accident reports revealed that the Germanwings Flight 9525 crash resulted from deliberate actions by the co-pilot, while the Shaw Air Force Base crash was attributed to errors during a training mission. Both accidents were deemed “pilot error,” but the frameworks identified human factors beyond pilot error. This comprehensive analysis allows safety personnel and policymakers to address underlying safety issues. The research also shows the necessity of enhancing the model to encompass emerging aviation technologies such as artificial intelligence. HFACS is successful in identifying human errors, and the data it produces are crucial to refining aviation safety measures and preventing similar mishaps in the future.

Keywords: aviation safety, human factors, pilot training, safety assessment

Human Factors in Aviation

Human factors are a persistent challenge to aviation safety. Andrei et al. (2021) note that “80% of aeronautical incidents are produced due to human factor errors” (67). A single factor, even a human factor, rarely causes an accident on its own. Instead, accidents are often caused by complex interactions between human and non-human factors. This is why it is essential to gather and understand human error data to improve aviation safety and prevent future mishaps.

The Human Factors Analysis and Classification System (HFACS) is a framework for structuring and comprehending the complex, dynamic nature of human error in aviation. HFACS was created by Dr. Shappell and Dr. Wiegmann as an extension of Reason’s Swiss Cheese model (Wiegmann et al., 2022). The Swiss Cheese model visualizes an organization as a stack of Swiss cheese: the slices are the levels of the organization, the holes are the problems at each level, and a hole passing all the way through the stack is what enables an incident.

The HFACS model provides a more comprehensive version of the Swiss Cheese model through its categorization of human error into four levels: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences (Wiegmann et al., 2022). Unsafe acts form the first level and are divided into errors and violations. Preconditions for unsafe acts are conditions that increase the odds of an unsafe act occurring. Unsafe supervision is improper oversight that enables, or fails to prevent, the accident. Finally, organizational influences are the cultures, policies, or procedures that allow the accident to occur (Wu et al., 2023).

This research paper explores human factor models in aviation, using one civilian and one military accident from the past decade as examples. The first is the civilian Germanwings Flight 9525 crash in 2015, as documented in the Bureau d’Enquetes et d’Analyses (BEA) final accident report (2016). The second is the military Shaw Air Force Base F-16CM crash in 2020, documented in the United States Air Force (USAF) Aircraft Accident Investigation Board report (2020).

Literature Review

Literature from scholarly sources and official aviation accident reports was reviewed to reveal common human factors that contribute to aviation accidents. Variables such as pilot error, organizational culture, and technological factors were key contributors to both accidents. Because only a limited number of accidents was investigated, there may be outliers, such as unique environmental conditions (for example, flying at night) or system failures, that require specific attention in aviation safety protocols. As noted in the literature, "Accidents should be viewed as control issues rather than failure issues, and they may be prevented by setting limitations on component behaviors and interactions" (Wu et al., 2023, 2).

HFACS and Swiss Cheese Model

Various frameworks work alongside the Swiss Cheese model, most notably the HFACS framework (Fig. 1). The HFACS framework is depicted as a hierarchy of the levels of human error.

The first level in the HFACS hierarchy is unsafe acts. This layer comprises the actions of individuals that directly create the unsafe situation. Unsafe acts can be further divided into errors and violations. Errors involve decision making, skills, and perception; violations are deviations from standards and regulations.

The next level of HFACS is the preconditions for unsafe acts. Preconditions include environmental factors, personnel factors, and the condition of the operator’s mental or physical state. Environmental factors can be physical or technological. Personnel factors include crew resource management and personal readiness. “Holes that occur at the Unsafe Acts level, and even some at the Preconditions level, represent active failures... they are seen as being actively involved or directly linked to the bad outcome" (Wiegmann et al., 2022, 120).

Next is unsafe supervision, which examines supervisors’ general practices and the decisions that led to the accident. Unsafe supervision can be divided into four categories: inadequate supervision, planned inappropriate operations, failure to correct a problem, and supervisory violations (Wiegmann et al., 2022).

Last is organizational influences, the broadest level of HFACS. This layer evaluates higher organizational management and systemic factors within the organization. Key factors within organizational influences are resource management, organizational climate, and organizational processes. "Latent failures occur higher up in the system (unsafe supervision and organizational influences)... They are referred to as 'latent' because when they occur or open, they often go undetected” (Wiegmann et al., 2022, 120).

Figure 1

The HFACS Framework


Note. The HFACS framework depicts four levels of failure with causal categories of active and latent failures that occur (Skybrary, n.d.).
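Because HFACS is essentially a typed taxonomy, its structure can be encoded directly. The following minimal Python sketch is our illustration, not part of the framework itself; the class and field names are hypothetical. It captures the four levels and the categories described above and rejects any finding tagged with a category that does not belong to its level.

```python
from dataclasses import dataclass
from enum import Enum

class HfacsLevel(Enum):
    UNSAFE_ACTS = 1                    # active failures by operators
    PRECONDITIONS_FOR_UNSAFE_ACTS = 2  # conditions raising the odds of unsafe acts
    UNSAFE_SUPERVISION = 3             # oversight that enables or fails to prevent
    ORGANIZATIONAL_INFLUENCES = 4      # culture, policy, and resource decisions

# Categories within each level, as described in the text above.
CATEGORIES = {
    HfacsLevel.UNSAFE_ACTS: ["error", "violation"],
    HfacsLevel.PRECONDITIONS_FOR_UNSAFE_ACTS: [
        "environmental factors", "personnel factors", "condition of operator"],
    HfacsLevel.UNSAFE_SUPERVISION: [
        "inadequate supervision", "planned inappropriate operations",
        "failure to correct a problem", "supervisory violations"],
    HfacsLevel.ORGANIZATIONAL_INFLUENCES: [
        "resource management", "organizational climate", "organizational processes"],
}

@dataclass
class Finding:
    """One investigation finding, tagged with its HFACS level and category."""
    description: str
    level: HfacsLevel
    category: str

    def __post_init__(self):
        # Reject a category that does not belong to the claimed level.
        if self.category not in CATEGORIES[self.level]:
            raise ValueError(f"{self.category!r} is not a category of {self.level.name}")
```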

Additionally, the Swiss Cheese model is a theory used to understand how accidents occur within a complex system. Reason built it on the theory of active and latent failures. It explains, and helps visually depict, how accidents occur when small mistakes line up perfectly to culminate in an accident. The model is illustrated as multiple slices of Swiss cheese stacked together: each slice is a portion of the complex system, and each hole in the cheese represents a failure within that portion. It is unlikely that a hole will pass all the way through the stacked slices, but it can happen. Similarly, it is unlikely that a problem will persist through every layer of a complex system without being fixed or stopped, but it is not impossible.
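The model’s core intuition, that an accident requires failures in every layer to coincide, can be made concrete with a toy calculation. The sketch below is a hypothetical illustration with made-up probabilities, under the simplifying assumption that layers fail independently; real systems exhibit coupled failures, as Wu et al. (2023) analyze.

```python
import random

def accident_probability(hole_probs, trials=1_000_000, seed=42):
    """Monte Carlo estimate of holes lining up across all layers at once."""
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in hole_probs)  # every layer fails together
        for _ in range(trials)
    )
    return accidents / trials

# Four defensive layers, each with a hypothetical 5% chance of a hole:
# the combined accident rate is roughly 0.05**4, about 6 in a million.
print(accident_probability([0.05, 0.05, 0.05, 0.05]))
```

Even with these made-up numbers, the model’s point survives: tightening any single layer multiplies down the chance that a flaw propagates all the way through the stack.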

Shaw Air Force Base F-16CM Accident

The final report from the USAF (2020) Aircraft Accident Investigation Board is a comprehensive review of the events leading to the crash. The investigation identified layers of human factors that contributed to the event. This section summarizes the findings from the report, with an emphasis on human factors. The USAF (2020) summarized the incident as follows:

On 30 June 2020, the mishap pilot, flying F-16CM tail number 94-0043, assigned to the 77th Fighter Squadron, 20th Fighter Wing, Shaw Air Force Base, South Carolina, engaged in a night mission qualification training flight near Shaw Air Force Base (AFB). During the recovery and landing phase of the mission, at approximately 2226 local time, the mishap aircraft’s landing gear was damaged in an initial landing attempt at Shaw Air Force Base. In a subsequent landing attempt, at approximately 2259 local time, the mishap aircraft departed the runway, and the mishap pilot was fatally injured during an unsuccessful ejection. (i)

Events leading up to the crash involved multiple active and latent human errors. The mishap pilot was conducting his first-ever air-to-air refueling mission at night. F-16 pilots usually first conduct air-to-air refueling in daytime at a training course; this pilot had been unable to accomplish that training, so the responsibility fell on the fighter wing instructors. Against regulation, the mishap pilot was ordered to conduct night-time air-to-air refueling for his mission, and he failed. This caused the pilot and the flight lead to return to base early due to insufficient fuel to complete the mission. During the return, the mishap pilot was reported to have been aggravated with himself and experienced subsequent distractions (USAF, 2020).

During the pilot’s initial landing, on short final, he mistook the 1,000-foot light bar for the runway threshold. This led to a short landing in which the aircraft struck the localizer antenna, damaging its left main landing gear. The pilot initiated a go-around and rejoined the flight lead. He then had time to discuss potential courses of action with his flight lead, the supervisor of flying, and other leadership before deciding on an action (USAF, 2020).

The supervisor of flying reported two options: a controlled ejection or an approach-end cable engagement. The decision was made for the cable engagement, which, in hindsight, was the incorrect decision based on procedures and regulations. The report noted that the supervisor of flying should have consulted a checklist that would have directed him to contact an on-call Lockheed Martin engineer. The engineer’s team reported after the accident that they would have recommended a controlled ejection based on prior similar incidents (USAF, 2020).

The mishap pilot was unsuccessful in engaging the cable; the aircraft’s left wing dropped to the ground, and the pilot attempted to abort. Unable to abort, he ejected. A critical failure in the ejection seat prevented the parachute from deploying, and the mishap pilot impacted the ground while still in the ejection seat (USAF, 2020).

Germanwings Flight 9525 Accident

The final report from the Bureau d’Enquetes et d’Analyses (BEA, 2016) dissected the errors that led to the crash of Flight 9525. The copilot received a class 1 medical certificate without restrictions in 2008. A depressive episode then delayed his yearly medical renewal until he was given an endorsement with a waiver for special conditions and restrictions. From 2009 to 2014, the copilot was examined by aeromedical examiners who were aware of his depressive episode, yet no examiner ever required further psychiatric assessment. Between his first waiver and the crash, there was no officially mandated psychiatric or psychological involvement with the copilot. However, a private physician referred the copilot to a specialist for possible psychosis two weeks prior to the crash, and the copilot was prescribed antidepressant and sleep-aid medication that he was taking at the time of the crash. The private physician’s diagnosis was not communicated beyond the patient, so no aviation authority was advised of the medical condition (BEA, 2016).

The aircraft flew from Düsseldorf to Barcelona the morning of the crash. During that flight, the copilot simulated a descent by setting the selected altitude to 100 ft while alone in the cockpit. On the flight leading to the crash, the captain left the cockpit while the aircraft was at 38,000 feet. While alone, the copilot changed the altitude setting to 100 ft, starting a descent. Air traffic control and air defense noticed the abnormality and tried to reach the aircraft without a response. The captain and the crew tried to get inside the cockpit and get an answer from the copilot during the descent. The aircraft continued its descent until it collided with terrain (BEA, 2016).

The BEA (2016) listed the following causes in its official report:

The collision with the ground was due to the deliberate and planned action of the co-pilot who decided to commit suicide while alone in the cockpit. The process for medical certification of pilots, in particular self-reporting in case of decrease in medical fitness between two periodic medical evaluations, failed in preventing the co-pilot, who was experiencing mental disorder with psychotic symptoms, from exercising the privilege of his license.

The following factors may have contributed to the failure of this principle: the co-pilot’s probable fear of losing his ability to fly as a professional pilot if he had reported his decrease in medical fitness to an AME; the potential financial consequences generated by the lack of specific insurance covering the risks of loss of income in case of unfitness to fly; the lack of clear guidelines in German regulations on when a threat to public safety outweighs the requirements of medical confidentiality.

Security requirements led to cockpit doors designed to resist forcible intrusion by unauthorized persons. This made it impossible to enter the flight compartment before the aircraft impacted the terrain in the French Alps.

Methodology

Qualitative data were collected from scholarly literature on human factors and from a comprehensive review of the two aviation accident reports. A full review of these failures in aviation would require analysis through case studies, accident investigations, and simulations. No simulations were conducted in this research; however, simulation results reported in the investigations were used as data.

After the data were gathered, they were reviewed for recurring themes and patterns related to human error. The crash analysis takes each error and categorizes it into its corresponding layer of the HFACS framework. Finally, a discussion of the implications offers a comprehensive understanding of the human factors failures in the accidents. Limitations exist within this methodology because it scrutinizes actions in retrospect, and some of the evidence may be subjective and underdeveloped.

Accident Analysis

Using the HFACS framework, the human factor errors of the Germanwings Flight 9525 accident were categorized. The unsafe act is the copilot’s deliberate action of descending the aircraft into terrain. This is an active failure that directly caused the accident.

Next, the preconditions for the unsafe act are the copilot’s mental health and the inability to access the locked cockpit. The copilot’s mental illness was diagnosed and medication was prescribed; that this was known yet never communicated to authorities was a significant factor enabling the unsafe act. Additionally, the captain’s inability to access the locked cockpit was a precondition that allowed the unsafe act to continue.

Unsafe supervision was another layer of the system that failed. The copilot’s mental health issues were not adequately monitored after the failed medical examination in 2008, and he was never reexamined by an aeromedical examiner for mental health; both were failures at the supervisory layer of the framework.

Finally, the organizational influences are the airline’s policies on pilot mental health. The organization’s lack of proactive health checks was another failure that inadvertently led to the crash. All of these human factor errors happened without intervention, lining up the holes in the system and causing the crash.
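To make the categorization above concrete, the same findings can be recorded with the sketch taxonomy introduced earlier, reusing the hypothetical Finding and HfacsLevel definitions. The descriptions paraphrase the BEA (2016) findings, and the tags reflect this paper’s reading, not an official coding.

```python
from collections import defaultdict

germanwings_findings = [
    Finding("Deliberate descent of the aircraft into terrain by the copilot",
            HfacsLevel.UNSAFE_ACTS, "violation"),
    Finding("Diagnosed mental illness never communicated to aviation authorities",
            HfacsLevel.PRECONDITIONS_FOR_UNSAFE_ACTS, "condition of operator"),
    Finding("Reinforced cockpit door prevented the captain's re-entry",
            HfacsLevel.PRECONDITIONS_FOR_UNSAFE_ACTS, "environmental factors"),
    Finding("No psychiatric re-examination after the 2008 depressive episode",
            HfacsLevel.UNSAFE_SUPERVISION, "failure to correct a problem"),
    Finding("No proactive mental health checks in airline policy",
            HfacsLevel.ORGANIZATIONAL_INFLUENCES, "organizational processes"),
]

# Group findings by level to show that every HFACS layer was breached.
by_level = defaultdict(list)
for finding in germanwings_findings:
    by_level[finding.level].append(finding.description)

for level in HfacsLevel:
    print(level.name, "->", by_level.get(level, []))
```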

The Germanwings Flight 9525 crash exposed a range of issues that need to be addressed. Organizations need to improve mental health screening and support for members of the aviation industry. They also need to implement policies and practices for health reporting and re-evaluation. Wu et al. (2016) have shown the prevalence of mental health issues in the aviation community, so this is not a one-off case.

With the HFACS framework, an analysis of the Shaw AFB F-16CM accident was also accomplished. The event was determined to have been caused, in whole or in part, by the following factors: a procedure not followed correctly, environmental conditions affecting vision, distraction, and inadequate supervisory and command oversight.

The unsafe act was that the procedure was not flown correctly: the mishap pilot steepened his descent angle prematurely (USAF, 2020). The pilot was distracted, focusing on his failure to complete the night-time air-to-air refueling rather than on his current operational objective of landing the aircraft. The distraction in this incident shows the effect of psychological human factors on skill performance (USAF, 2020). As Wu et al. (2023) suggest, "Accidents should be viewed as control issues rather than failure issues, and they may be prevented by setting limitations on component behaviors and interactions” (2). This highlights the need for training in night-time operations and instrument conditions to better control the situation.

The precondition for the unsafe act was the mishap pilot flying a night-time air-to-air refueling mission without proper training. The environmental condition of flying at night reduced visibility, degrading the pilot’s vision and forcing reliance on aircraft instruments. The night conditions increased the complexity of both the air-to-air refueling and the landing (USAF, 2020).

There was also a failure by the supervisor of flying and the mishap pilot’s leadership. Leadership approved the night sortie without prior completion of similar tasks, such as day-time air-to-air refueling, putting the mishap pilot in a position to fail. The supervisor of flying did not follow regulation, a supervisory oversight that contributed to the failure to land the aircraft safely (USAF, 2020).

The final HFACS analysis shows that the mishap had causes at each layer of the model: unsafe acts by the pilot, preconditions for unsafe acts from the night sortie, unsafe supervision from the supervisor of flying, and organizational influences from the mishap pilot’s fighter wing. Using Reason’s Swiss Cheese model, the failures can be visualized as holes in the layers of cheese lining up to create a pathway through the stack: the accident.

Implications

The research is meant to shed light on human factors that can lead to accidents. Aviators and safety specialists can use this information to help prevent further incidents. Individuals and organizations with authority can change safety protocols, take proactive measures, and implement safety audits to avoid repeating these events. However, such failures can also have negative implications for several stakeholders, shining a light on failures within the system that reflect poorly on companies, people, and even states. Nevertheless, these failures need to be addressed.

There are significant implications for aviation safety professionals, policymakers, and stakeholders. Stakeholders must take the data from the accidents and implement changes to create a proactive approach to safety and to prevent future accidents.

The HFACS model’s application in these two case studies points to several implications for stakeholders in aviation safety. The identified unsafe acts and preconditions for unsafe acts show the need for improvement in F-16 pilot training and accountability. The pilot’s mistakes reveal deficits in skill, decision making, and situational awareness that additional training could have prevented. The training structure was also a failure and could be improved to ensure future students do not experience the same preconditions for unsafe acts.

There were also failures in leadership and management, appearing as unsafe supervision and organizational influence factors in the HFACS model. Leadership made decisions that bypassed safety features in regulations, creating an unsafe situation. The supervisor of flying also made decisions that went against how the emergency checklist should have been used, showing the need for additional training or clearer guidance on operating the checklists. Wu et al. (2023) suggest that understanding these systems can create more extensive policies and practices, writing, "the usefulness of the system dynamics models depends on its ability to provide endogenous explanations of observed behavior and support policy design" (5). Exposing the complexity of human factors through HFACS allows for systemic changes that could prevent future human errors and aviation accidents.

Implications from the Germanwings crash have also emerged since 2015. One is the Pilot Mental Health Working Group of the Aerospace Medical Association, which aims to “improve awareness and identification of pilot mental health issues during the aeromedical assessment of pilots” (Anzalone et al., 2016, 505). Results of this group include “successful approaches that improve rates of reporting, discussion, and participation” (506), such as Project Wingman, the Air Line Pilots Association Human Intervention Motivation Study, and the Delta Airlines Pilot Assistance Network (Anzalone et al., 2016).


Future Research

Future research should explore gaps within the current HFACS model and its application to aviation safety. The growth of technology and artificial intelligence in aviation may present a new layer for the HFACS model: not the technology or artificial intelligence itself, but the human interaction with it. Andrei et al. (2021) explain, “This 'human-machine' interaction must be designed in such a way that the operation of the machine is based on human characteristics, in order to limit potential error” (68). A projected problem is human complacency or overconfidence arising from automation. Implications from this new layer could show a trend toward the need for additional technology and artificial intelligence training programs.

Conclusion

Human factors in aviation form a complex and dynamic system that requires a systematic and comprehensive review to fully understand. HFACS and the Swiss Cheese model are frameworks that allow researchers to break down the human-factor-based causes of accidents and categorize them.

The reports from the BEA (2016) and the USAF (2020) determined that the Germanwings Flight 9525 crash was a deliberate action by the co-pilot and that the Shaw AFB crash was a series of errors in a training mission. Further evaluation with HFACS identified more than pilot errors and faults: there were also inadequate training preconditions and organizational influences with inappropriate safety cultures. The HFACS framework allowed for a complex analysis of the human factors within the two accidents, which safety personnel, aviation policymakers, and other stakeholders can use to proactively address issues. Future research should continue developing the HFACS model by expanding it to include emerging technologies, such as AI.


References

Andrei, A.-G., Balasai, R., Costea, M.-L., & Semenescu, A. (2021). Overview regarding human factors in aviation. Annals of the Academy of Romanian Scientists Series on Engineering Sciences, 13(1), 67-76.

Anzalone F., Belland K., Bettes T., Beven G., Bor R., Damann V., Dillinger T., Dowdall N., Evans A., Evans S., Flynn C., Fonne V., Front C., Gonzalez C., Hastings J., Herbert K., Chun C., Hudson M., King R., Lange M., … (2016). Pilot mental health: Expert working group recommendations - revised 2015. Aerospace Medicine and Human Performance, 87(5), 505-507. https://doi.org/10.3357/AMHP.4568.2016

Bureau d’Enquetes et d’Analyses (BEA). (2016). Final report: Accident on 24 March 2015 at Prads-Haute-Bléone (Alpes-de-Haute-Provence, France) to the Airbus A320-211 registered D-AIPX operated by Germanwings. Retrieved from https://bea.aero/uploads/tx_elydbrapports/BEA2015-0125.en-LR.pdf.

Georgiadou, N., & Kakarelidis, G. (2020). Greece flight 9525 crash: Balance between the rights of medical confidentiality and the general right of public safety under Greek law. European Data Protection Law Review (Internet), 6(4), 560-566. https://doi.org/10.21552/edpl/2020/4/12

Skybrary. (n.d.). The HFACS Framework [diagram]. Skybrary. https://skybrary.aero/articles/human-factors-analysis-and-classification-system-hfacs

Lewallen, M. (2024). Human Factors in F-16CM Crash. Unpublished manuscript. Liberty University.

Lewallen, M. (2024). Human Factors in Germanwings Flight 9525. Unpublished manuscript. Liberty University.

United States Air Force. (2020). Aircraft Accident Investigation Board Report: F-16CM, T/N 94-0043. Air and Space Forces Magazine. Retrieved from https://www.airandspaceforces.com/app/uploads/2020/11/F-16-Mishap-AIB-30-June-2020-Shaw-AFB-ACC.pdf.

Wiegmann, D. A., Wood, L. J., Cohen, T. N., & Shappell, S. A. (2022). Understanding the "Swiss cheese model" and its application to patient safety. Journal of Patient Safety, 18(2), 119-123. https://doi.org/10.1097/PTS.0000000000000810.

Wu, A. C., Donnelly-McLay, D., Weisskopf, M. G., McNeely, E., Betancourt, T. S., & Allen, J. G. (2016). Airplane pilot mental health and suicidal thoughts: A cross-sectional descriptive study via anonymous web-based survey. Environmental Health, 15(1), 121-121. https://doi.org/10.1186/s12940-016-0200-6

Wu, Y., Zhang, S., Zhang, X., Lu, Y., & Xiong, Z. (2023). Analysis on coupling dynamic effect of human errors in aviation safety. Accident Analysis and Prevention, 192, 107277-107277. https://doi.org/10.1016/j.aap.2023.107277.

About the author

Mathew Lewallen
Updated on Jun 30, 2025