Abstract
Human factors in aviation form a complex and dynamic system that requires systematic and comprehensive review to be fully understood. The Human Factors Analysis and Classification System model used within this research paper is a framework that allows researchers to break down the causes of accidents across multiple layers of human factors. The Swiss Cheese Model visually represents how these human factors add up to an accident within a complex system. Reports from the Bureau d’Enquêtes et d’Analyses and the United States Air Force determined that Germanwings Flight 9525 was a deliberate action by the co-pilot, and that the Shaw Air Force Base crash was a series of errors in a training mission. Further evaluation through the Human Factors Analysis and Classification System model determined that factors beyond pilot errors and faults contributed: there were also inadequate training preconditions and organizational influences with inappropriate safety cultures. The framework allowed for a complex analysis of human factors in the two accidents, which safety personnel, aviation policymakers, and other stakeholders can use to address issues proactively. Future research should extend the model to include emerging technologies, such as Artificial Intelligence.
Keywords: aviation safety, human factors, pilot training, safety assessment
Human Factors Implications in the Future
Human factors are a persistent challenge to aviation safety. Andrei et al. (2021) note that “80% of aeronautical incidents are produced due to human factor errors” (p. 67). Accidents are seldom caused by a single factor, even a human one. Instead, they are typically caused by complex interactions between human and non-human factors; on the human side, these are the individuals involved and their environment. Understanding human error is therefore essential to improving aviation safety and preventing future mishaps. This research paper explores the future of aviation in the context of human factors models, drawing parallels from two accidents over the past decade. The first is the Germanwings Flight 9525 crash in 2015, as reported in the Bureau d’Enquêtes et d’Analyses (BEA; 2016) final accident report. The second is the Shaw Air Force Base (AFB) F-16CM crash in 2020, reported in the United States Air Force (USAF; 2020) Aircraft Accident Investigation Board (AIB) report.
The Human Factors Analysis and Classification System (HFACS) is a theoretical framework used to structure and comprehend the complex and dynamic nature of human error in aviation. HFACS was created by Dr. Shappell and Dr. Wiegmann as an extension of Dr. Reason’s Swiss Cheese Model (SCM; Wiegmann et al., 2022).
The HFACS model provides a more comprehensive version of the SCM through its categorization of human error into unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences (Wiegmann et al., 2022). Unsafe acts form the first level of HFACS and are divided into the subcategories of errors and violations. The next level, preconditions for unsafe acts, comprises the conditions that increase the likelihood of unsafe acts occurring. Unsafe supervision is improper oversight that enables, or fails to prevent, an accident. The last level, organizational influences, covers the cultures, policies, and procedures that enable the accident to occur (Wu et al., 2023).
Theoretical Framework
There are many theoretical frameworks for human factors in aviation safety beyond HFACS and the SCM; however, this research paper uses these two in its evaluations. As already mentioned, HFACS provides a structured, layered approach to understanding human error by identifying unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences. Dr. Shappell and Dr. Wiegmann developed HFACS to enhance Dr. Reason’s model by showing the hierarchical relationships between factors (Wu et al., 2023). The SCM will be used in addition, to help visualize how multiple failures align to cause accidents.
The SCM is a theory used to understand how accidents occur within a complex system. Dr. Reason built it on the Theory of Active and Latent Failures, which explains that accidents occur when small mistakes line up to culminate in a mishap. The SCM is a simpler visual depiction of this theory: multiple slices of Swiss cheese are stacked together, each slice representing a portion of the complex system and each hole representing a failure within that portion. It is unlikely that a hole will pass through all of the combined slices, but it can happen. Similarly, it is unlikely that a problem will persist throughout a complex system without being fixed or stopped, but it is not impossible (Wiegmann et al., 2022).
The slices of cheese are further “described as Unsafe Acts, Preconditions for Unsafe Acts, Supervisory Factors, and Organizational Influences” (Wiegmann et al., 2022, p. 119). Each level is a dynamic barrier that must fail for a hazard to pass on to the next. Additionally, each hole, or failure, can be categorized as active or latent. An active failure can be stopped by immediately addressing the issue. A latent failure is systemic and requires changes in organizational influences and supervisory factors to remedy. Active failures often require a reactive response, while latent failures need to be addressed proactively.
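To make the alignment intuition concrete, the short Monte Carlo sketch below simulates hazards passing through the four SCM layers. It is a minimal illustration only: the per-layer failure probabilities are hypothetical placeholders, not values derived from the accident data or from Wiegmann et al. (2022).

```python
import random

# Hypothetical probabilities that each SCM layer fails, i.e., that a
# "hole" is open when a hazard reaches that slice of cheese.
LAYERS = {
    "Organizational Influences": 0.05,
    "Unsafe Supervision": 0.05,
    "Preconditions for Unsafe Acts": 0.10,
    "Unsafe Acts": 0.10,
}

def hazard_penetrates(layers):
    """A hazard becomes an accident only if every layer's hole is open."""
    return all(random.random() < p for p in layers.values())

def estimate_accident_rate(trials=1_000_000):
    """Monte Carlo estimate of how often all of the holes align."""
    accidents = sum(hazard_penetrates(LAYERS) for _ in range(trials))
    return accidents / trials

# With independent layers, the analytic rate is the product of the
# probabilities: 0.05 * 0.05 * 0.10 * 0.10 = 2.5e-5.
print(f"Estimated accident rate: {estimate_accident_rate():.6f}")
```

Because the layer probabilities multiply, even modestly reliable layers make full alignment rare, which is the defense-in-depth logic the SCM visualizes; closing any single hole drives the product toward zero.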
The HFACS framework creates a hierarchy within the levels of human error. Unsafe acts, at the bottom, consist of errors or violations. Preconditions for unsafe acts then set the conditions that allow or cause the unsafe acts to occur. Unsafe supervision is the lack of proper oversight needed to catch an error or correct a failure that takes place. Finally, organizational influences sit at the highest level, where cultures, policies, or procedures within the organization enable accidents to happen (Wu et al., 2023).
The lowest level of HFACS is unsafe acts. This layer involves the actions of individuals that directly influence the unsafe situation. Unsafe acts can be further divided into errors and violations. Errors arise in decisions, skills, and perceptions; violations are deviations from standards and regulations (Wiegmann et al., 2022).
The next level of HFACS is the preconditions for unsafe acts: the conditions that enable individuals to commit the unsafe acts described above. Preconditions include environmental factors, personnel factors, and the condition of the operator’s mental or physical state. Environmental factors can be physical or technological, while personnel factors include crew resource management and personal readiness. “Holes that occur at the Unsafe Acts level, and even some at the Preconditions level, represent active failures... they are seen as being actively involved or directly linked to the bad outcome” (Wiegmann et al., 2022, p. 120).
Next is unsafe supervision, which reviews the supervisor’s general practices and the decisions leading to the accident. Unsafe supervision can be subdivided into four categories: inadequate supervision, planned inappropriate operations, failure to correct problems, and supervisory violations (Wiegmann et al., 2022).
Last is organizational influences, the broadest level of HFACS. This layer evaluates higher organizational management and systemic factors within the organization. Key factors within organizational influences are resource management, organizational climate, and organizational processes. “Latent failures occur higher up in the system (unsafe supervision and organizational influences)... They are referred to as ‘latent’ because when they occur or open, they often go undetected” (Wiegmann et al., 2022, p. 120).
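With all four levels described, the taxonomy can be summarized in code. The Python sketch below is one possible encoding; the names (HFACSLevel, Factor, the subcategory strings) are illustrative choices made for this paper, not identifiers from any published HFACS tooling.

```python
from dataclasses import dataclass
from enum import Enum

class HFACSLevel(Enum):
    """The four HFACS levels, ordered from lowest to highest."""
    UNSAFE_ACTS = 1
    PRECONDITIONS = 2
    UNSAFE_SUPERVISION = 3
    ORGANIZATIONAL_INFLUENCES = 4

# Subcategories as described in the text, keyed by level.
SUBCATEGORIES = {
    HFACSLevel.UNSAFE_ACTS: ["errors", "violations"],
    HFACSLevel.PRECONDITIONS: [
        "environmental factors", "personnel factors", "condition of operators"],
    HFACSLevel.UNSAFE_SUPERVISION: [
        "inadequate supervision", "planned inappropriate operations",
        "failure to correct problems", "supervisory violations"],
    HFACSLevel.ORGANIZATIONAL_INFLUENCES: [
        "resource management", "organizational climate",
        "organizational processes"],
}

@dataclass
class Factor:
    """One contributing factor classified against the taxonomy."""
    description: str
    level: HFACSLevel
    subcategory: str
    latent: bool  # latent failures sit at the higher levels

    def __post_init__(self):
        if self.subcategory not in SUBCATEGORIES[self.level]:
            raise ValueError(
                f"{self.subcategory!r} is not a subcategory of {self.level.name}")
```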
HFACS can be, and regularly is, integrated with other theoretical models such as Dr. Reason’s SCM. Where HFACS details and analyzes human errors, the SCM provides a visual representation of how failures align to cause an accident. This integration makes HFACS a more comprehensive tool for analyzing human error in aviation, which helps spot trends and systemic problems.
HFACS of Germanwings Flight 9525
In the BEA (2016) investigation of the Germanwings Flight 9525 accident, applying the HFACS framework reveals failures within the system, or holes in the Swiss cheese. The unsafe act was the copilot’s deliberate action of descending the aircraft into terrain. This was an active failure that directly contributed to the accident; had there been a reactive fix to this failure, the event would not have occurred. Next, the preconditions for the unsafe act were the copilot’s mental health and the captain’s inability to access the locked cockpit. The copilot’s mental illness had been diagnosed and medication prescribed; the fact that it was known and not communicated to authorities was a significant factor enabling the unsafe act. Additionally, the captain’s inability to access the locked cockpit was a precondition that allowed the unsafe act to happen.
Unsafe supervision was another layer of the system that failed. The copilot’s mental health issues were not adequately monitored after the failed medical examination in 2008; his never being reexamined by an aeromedical examiner for mental health was a failure in the supervisory portion of the system. Finally, the organizational influence was the airline’s policy on pilot health: the organization’s lack of proactive health checks was another failure that inadvertently led to the crash. All these human factors failures happened without intervention, lining up the holes in the system and causing the crash.
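Continuing the illustrative taxonomy sketch above, the failures identified in this analysis could be coded as follows. The assignments mirror this paper’s reading of the BEA (2016) report, not an official HFACS classification by the BEA.

```python
germanwings_factors = [
    Factor("Copilot's deliberate descent into terrain",
           HFACSLevel.UNSAFE_ACTS, "violations", latent=False),
    Factor("Diagnosed mental illness not communicated to authorities",
           HFACSLevel.PRECONDITIONS, "condition of operators", latent=False),
    Factor("Captain unable to access the locked cockpit",
           HFACSLevel.PRECONDITIONS, "environmental factors", latent=False),
    Factor("No aeromedical reexamination after the failed 2008 medical exam",
           HFACSLevel.UNSAFE_SUPERVISION, "failure to correct problems",
           latent=True),
    Factor("Airline policy lacking proactive pilot health checks",
           HFACSLevel.ORGANIZATIONAL_INFLUENCES, "organizational processes",
           latent=True),
]

# Every HFACS level is represented: the holes span the entire system.
assert {f.level for f in germanwings_factors} == set(HFACSLevel)
```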
The Germanwings Flight 9525 crash exposed issues that need to be addressed. Organizations need to improve mental health screening and mental health support for members of the aviation industry. They also need to implement policies and practices for health reporting and re-evaluations. Wu et al. (2016) have shown the prevalence of mental health issues in the aviation community, so this is not just a one-off case.
HFACS of Shaw AFB F-16CM
The USAF (2020) conducted an AIB investigation evaluating the Shaw AFB accident using the HFACS 7.0 model. It determined that the event was caused, in whole or in part, by the following factors: a procedure not followed correctly, environmental conditions affecting vision, distraction, and supervisory/command oversight. The unsafe act was the procedure not followed correctly: on short final, the mishap pilot steepened his descent angle prematurely (USAF, 2020). As Wu et al. (2023) suggest, “Accidents should be viewed as control issues rather than failure issues, and they may be prevented by setting limitations on component behaviors and interactions” (p. 2). This highlights the need for training in night-time operations and instrument conditions to better control the situation.
The incident was also caused by a precondition for the unsafe act: the mishap pilot flew a night-time air-to-air refueling mission without prior experience. The environmental condition of flying at night affected the pilot’s vision by reducing visibility and forcing reliance on aircraft instruments. The night conditions exacerbated the complexity of both the air-to-air refueling and the landing (USAF, 2020).
The pilot was also influenced by distraction, which falls under the HFACS layer of unsafe acts, in the errors subcategory. The mishap pilot was focused on his failure to complete his night-time air-to-air refueling mission, which distracted him from his current operational objective of landing the aircraft. The distraction in this incident shows the effect of psychological human factors on skill performance (USAF, 2020).
There was also a failure by the supervisor of flying and the mishap pilot’s command leadership. Command approved the night sortie without prior completion of similar tasks, such as air-to-air refueling in the daytime, putting the mishap pilot in a position to fail. The supervisor of flying decided not to follow regulations, a lapse in supervisory oversight that contributed to the failure to land the aircraft safely (USAF, 2020).
The final HFACS assessment shows that the mishap had causes at each layer of the model: an unsafe act by the pilot, preconditions for unsafe acts from the night sortie, unsafe supervision from the supervisor of flying, and organizational influences from the mishap pilot’s fighter wing. Using Dr. Reason’s SCM, the failures can be visualized as holes in the layers of cheese lining up to create a pathway through the system, the accident.
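The Shaw AFB factors can be coded against the same illustrative structure. The subcategory assignments below are one reasonable reading of the AIB findings as summarized here, not the board’s own HFACS 7.0 coding.

```python
shaw_factors = [
    Factor("Descent angle steepened prematurely on short final",
           HFACSLevel.UNSAFE_ACTS, "errors", latent=False),
    Factor("Attention captured by the failed air-to-air refueling",
           HFACSLevel.UNSAFE_ACTS, "errors", latent=False),
    Factor("Night conditions degrading vision, forcing instrument reliance",
           HFACSLevel.PRECONDITIONS, "environmental factors", latent=False),
    Factor("Night refueling mission flown without prior daytime experience",
           HFACSLevel.PRECONDITIONS, "personnel factors", latent=False),
    Factor("Supervisor of flying deviating from regulations",
           HFACSLevel.UNSAFE_SUPERVISION, "supervisory violations", latent=True),
    Factor("Wing approval of the sortie without prerequisite daytime tasks",
           HFACSLevel.ORGANIZATIONAL_INFLUENCES, "organizational processes",
           latent=True),
]

# Group the coded factors by level for a quick per-layer summary.
for level in HFACSLevel:
    print(level.name, [f.description for f in shaw_factors if f.level == level])
```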
Literature Review
Literature was reviewed from scholarly sources and official aviation accident reports to reveal common human factors that contribute to aviation accidents. Variables such as pilot error, organizational culture, and technological factors were key contributors. Given the limited number of accidents investigated, there are potential outliers, such as unique environmental conditions like flying at night, or system failures that require specific attention in aviation safety protocols. As noted in the literature, “Accidents should be viewed as control issues rather than failure issues, and they may be prevented by setting limitations on component behaviors and interactions” (Wu et al., 2023, p. 2).
Methodology
Qualitative research was conducted on scholarly literature on human factors, together with a comprehensive review of the two aviation accident reports. Reviewing failures in aviation requires various approaches, such as case studies, accident investigations, and simulations; in this research, there is no active simulation, only the reported simulation results from the investigations. All data were synthesized and then reviewed using the HFACS model. A discussion of implications then provides a final, comprehensive understanding of the human factors failures in the accidents.
Implications
There are significant implications for aviation safety professionals, policymakers, and stakeholders. The data from these accidents offer insights for preventing future human error: developing targeted training programs, amending regulatory policies, and changing organizational practices. Stakeholders must take the causes of the studied accidents and implement changes to create a proactive approach to safety and to prevent future accidents.
The implications are that there needs to be additional training or clearer guidance on operating checklists. Wu et al. (2023) suggest that understanding the systems can produce more extensive policies and practices, writing that “the usefulness of the system dynamics models depends on its ability to provide endogenous explanations of observed behavior and support policy design” (p. 5). This shows that the complexity of human factors captured in HFACS can drive systemic changes that could prevent future human errors and aviation accidents.
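To illustrate what an “endogenous explanation of observed behavior” can look like, the toy loop below lets observed errors create policy pressure, pressure accumulate into a training stock, and training suppress the error rate. Every parameter is a hypothetical placeholder chosen for illustration; this is not a model drawn from Wu et al. (2023).

```python
def simulate_safety_dynamics(steps=50, base_error_rate=0.10,
                             training_effect=0.5, policy_response=0.3):
    """Toy loop: errors -> policy pressure -> training -> fewer errors."""
    error_rate = base_error_rate
    training = 0.0
    history = []
    for _ in range(steps):
        pressure = policy_response * error_rate  # pressure tracks observed errors
        training = 0.9 * training + pressure     # training stock builds and decays
        error_rate = base_error_rate / (1.0 + training_effect * training)
        history.append(error_rate)
    return history

# The error rate settles at an equilibrium produced by the feedback loop
# itself: the behavior is explained endogenously, not by outside shocks.
print(f"Equilibrium error rate: {simulate_safety_dynamics()[-1]:.4f}")
```

In a real application, the stocks and flows would be elicited from the organization’s own policies and data rather than assumed.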
Future Research
Future research should explore gaps within the current HFACS model and its application to aviation safety. The growth of technology and artificial intelligence (AI) in aviation may present a new layer for the HFACS model: not the technology and AI themselves, but the human interaction with them. Andrei et al. (2021) explain, “This 'human-machine' interaction must be designed in such a way that the operation of the machine is based on human characteristics, in order to limit potential error” (p. 68). One specific projected problem is human complacency or overconfidence stemming from automation. Implications from this new layer could show a trend toward the need for additional technology and AI training programs.
Conclusion
Human factors in aviation form a complex and dynamic system that requires systematic and comprehensive review to be fully understood. The HFACS and SCM frameworks used within this research paper allow researchers to break down the causes of accidents across multiple layers of human factors. The SCM provides a visual representation of these human factors adding up to an accident within a complex system.
The reports from the BEA (2016) and USAF (2020) determined that Germanwings Flight 9525 was a deliberate action by the co-pilot, and that the Shaw AFB crash was a series of errors in a training mission. Further evaluation with HFACS revealed more than pilot errors and faults: there were also inadequate training preconditions and organizational influences with inappropriate safety cultures.
The HFACS framework allowed for a complex analysis of human factors within the two accidents, which safety personnel, aviation policymakers, and other stakeholders can use to address issues proactively. Future research should extend the HFACS model to include emerging technologies, such as AI.
References
Andrei, A.-G., Balasai, R., Costea, M.-L., & Semenescu, A. (2021). Overview regarding human factors in aviation. Annals of the Academy of Romanian Scientists Series on Engineering Sciences, 13(1), 67-7. https://www.researchgate.net/publication/368564559_OVERVIEW_REGARDING_HUMAN_FACTORS_IN_AVIATION
Bureau d’Enquêtes et d’Analyses (BEA). (2016). Final report: Accident on 24 March 2015 at Prads-Haute-Bléone (Alpes-de-Haute-Provence, France) to the Airbus A320-211 registered D-AIPX operated by Germanwings. https://bea.aero/uploads/tx_elydbrapports/BEA2015-0125.en-LR.pdf
United States Air Force (USAF). (2020). Aircraft accident investigation board report: F-16CM, T/N 94-0043. https://www.airandspaceforces.com/app/uploads/2020/11/F-16-Mishap-AIB-30-June-2020-Shaw-AFB-ACC.pdf
Wiegmann, D. A., Wood, L. J., Cohen, T. N., & Shappell, S. A. (2022). Understanding the "Swiss Cheese Model" and its application to patient safety. Journal of Patient Safety, 18(2), 119-123. https://doi.org/10.1097/PTS.0000000000000810
Wu, Y., Zhang, S., Zhang, X., Lu, Y., & Xiong, Z. (2023). Analysis on coupling dynamic effect of human errors in aviation safety. Accident Analysis & Prevention, 192, Article 107277. https://doi.org/10.1016/j.aap.2023.107277