Rarely do complex questions have a simple answer. Rarely do problems have an isolated cause. While a Boeing study found that 80 percent of accidents occur because of human error, the reasons behind those errors are hard to isolate and not subject to simple analysis. The rising number of experts in the field of human factors clearly indicates that the aviation industry is trying to problem-solve, but a deeper analysis might be beneficial.
Many technologies were put in place to eradicate specific human errors—for example, emergency descent mode (EDM) to shield against another Payne Stewart accident, or geo-referenced overlays on approach charts to protect against a repeat of the Cali, Colombia disaster, where the crew lost situational awareness close to the ground.
Technological advancements continue to focus on safety and indeed have enhanced it, from general aviation to air carrier operations. Improvements range from airframe parachutes to new surveillance tools like ADS-B. Predictive weather radar and combined vision systems push the envelope even further.
To meet the demands of the evolving technological improvements, human factors experts have focused on human-to-machine integration. This new age of aviation safety began with warnings about the dangers of the “children of the magenta” or pilots relying too heavily on automation.
Despite all these technological advancements, human error remains the most-cited cause of accidents. In many regards, pilots have been trained as if they were machines; we are trained to react both mentally and physically without conscious processing in an emergency.
But in emergencies or other high-stress situations that require conscious processing, we’ve adopted tools such as the crew resource management (CRM) model to use all available information to maximize safety and efficiency. Along with CRM, pilots have aeronautical decision making (ADM) and threat and error management (TEM) in their human factors toolbox. Decade after decade we saw a new aviation safety process emerge as some extension of or augmentation to the previous model.
What if there’s a better way? What if this line of linear growth, layer after layer of new safety processes and tools, has blinded us to the ontological pivot necessary to analyze human error organically and responsively? It is time for the industry to take an interdisciplinary approach to a more comprehensive proactive safety system with cognitive processing as its fulcrum. When we put cognitive processing to work, we don’t just acquire knowledge; we build on that knowledge and gain insights that allow us to look at a problem by consciously merging input from a variety of sources to turn information into suitable action.
Our ‘Three Brains’ and SMS
The human brain is an impressive and complex sorting machine. It receives roughly 11 million bits of information per second but consciously processes only about 40 bits per second. This means that more than 99.99 percent of the information we receive never reaches conscious processing.
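The arithmetic behind that figure is simple to check. A minimal sketch, using the article's two cited estimates (11 million bits per second sensed, about 40 bits per second consciously processed):

```python
# The article's cited estimates, not measured values.
SENSORY_BITS_PER_SEC = 11_000_000   # information reaching the senses each second
CONSCIOUS_BITS_PER_SEC = 40         # information consciously processed each second

# Fraction of incoming information that never reaches conscious processing.
unprocessed_fraction = 1 - CONSCIOUS_BITS_PER_SEC / SENSORY_BITS_PER_SEC
print(f"{unprocessed_fraction:.4%}")  # ~99.9996%, i.e., "more than 99.99 percent"
```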
To catalog incoming data more quickly, part of the brain builds mental models and forms shortcuts, some of which give rise to cognitive biases. These biases were important for our evolution in moments of time-pressured decision-making: house cat or saber-toothed tiger, friend or foe. We’ll think of this functionality as our primitive brain.
Consider the three brains that collectively make up the human brain:
- The primitive brain keeps us alive by triggering the fight, flight, or freeze response.
- The emotional brain is home to memories and experiences. It starts as a blank slate and is programmed by our assumptions, beliefs, and experiences. This part of our brain helps us have empathy for others, but it can also make us act irrationally or emotionally when triggered.
- The thinking brain is the higher-level processing part of our brain responsible for problem-solving. This is where creativity comes from. We do our best work from our thinking brain.
The brain can help us be critical thinkers and brilliant innovators and allows us to have empathy for others. It can also lead us to perpetuate antiquated models and outdated stereotypes when it operates on overly simplified prototypes. Which outcome you get depends on which part of the brain you’re operating from, and that, in turn, shapes your safety culture.
Is Your Primitive Brain Hurting Your Safety Culture?
The primitive brain is an impressive (and not always accurate) cataloging engine that is always on high alert for potential danger or threat. Over the past few centuries, and especially in Western modernity, our perception of threat has shifted from predominantly physical to primarily mental. These mental threats activate the same fight, flight, or freeze response, triggering us to use cognitive shortcuts or biases in our decision-making. These shortcuts prevent us from seeing the full frame of human error and distort our decision-making process. In the same way, our response to perceived mental threats keeps us from viewing the system comprehensively and limits our ability to problem-solve.
Understanding how our biases impact others and how we are impacted when we perceive bias against us directly affects safety. Our ability to communicate, our likelihood of self-reporting, and our view of a just culture within our safety management system (SMS) are all impacted by cognitive biases.
A Harvard Business Review study found that employees who perceived bias against them at work were more likely to disengage, leaving them feeling angry and less proud of their organization. These employees were also three times more likely to quit their job within the year. The damage to safety shows up at the micro level (in the attitudes of individuals within the organization) and at the macro level (in low retention rates). It also has a direct impact on your organization’s safety culture.
To avoid operating in our primitive brain (which takes our thinking brain offline), we need to feel that we are secure and not under threat. Psychological safety is the amount of relational trust one feels in his or her environment. It means feeling comfortable to speak up, admit mistakes, and be your authentic self. It also means a safer employee and a safer organization, because it creates an environment where relationships are rooted in trust.
With a high level of psychological safety, the team has a sense of belonging and inclusion. This builds trust, which is fundamental to an organization’s SMS because it directly impacts an employee’s willingness to self-report, the underpinning of an effective proactive safety program.
Proactive safety is highly dependent on the individuals within an organization. To elicit a collaborative approach to safety, there must be a strong culture of self-reporting and an intrinsically just culture throughout the organization. This type of proactive safety requires a high level of psychological safety.
Overcoming the Gap in Human Factors
There is a gap in aviation safety. That gap is how to build trust, how to increase psychological safety, and how to promote a positive safety culture.
Our industry uses a host of data-driven approaches to enhance safety (ASIAS, FOQA, and ASAP are examples) but we have little academic research on the impact of cognitive biases on organizational safety culture. I aim to resolve this gap through doctoral research.
Indirectly, the industry already has the regulatory requirement and justification for expanding human factors training. The FAA now requires Part 121 operators to develop and implement an SMS program (14 CFR Part 5), and it’s only a matter of time before the requirement extends throughout the industry (much like the history of CRM).
Once an SMS program is established, indicators of compliance and performance include training personnel on “non-technical skills with the intent of reducing human error” (ICAO Annex 19 component 4). The rationale for expanding human factors training resides within the very structure of the system itself. The safety promotion pillar requires training, communication, and actionable progress toward a positive safety culture. The safety policy pillar obligates senior management to commit to the continual improvement of safety. It is, inarguably, the accountable executive’s and safety officer’s responsibility to find ways of enhancing safety through positive safety culture promotion initiatives. This compels us to ask the same question: how?
I advocate that the industry needs aviation-specific, academically derived cognitive bias research, which will provide the structured foundation for formally expanding human factors training to include elements such as emotional intelligence, psychological safety, inclusive leadership, and cognitive bias training as an approach to enhancing safety culture.
The neuroplasticity of our brains allows us to reframe how we think and gain more control over our decision-making process. We can use self-awareness to pause and examine whether we are operating from the primitive or emotional parts of our brain or from our thinking brain. This process actually builds new neural pathways, and it becomes a habit. We can leverage this neuroplasticity to create new pathways that keep us from operating on negative biases. Just as we’ve all been trained to recognize and break the accident chain, the same principles apply here: we recognize a pattern and we’re able to stop it. But that requires awareness and training.
If human error is the problem, the new era of aviation safety must begin with a granular analysis of cognitive processing. We cannot enhance safety without a better understanding of our cognitive biases and their impact on flight deck safety and organizational safety culture. An interdisciplinary research approach will target this pervasive safety gap to reduce human error and enhance the effectiveness of an organization’s SMS.
Contemporary cognitive science research is the new era of aviation safety. We can no longer afford to sit back passively, waiting for human factors research to be done in other fields (refracted through their institutional logics) before taking it up and grafting it onto aviation safety systems. For cognitive science research to accurately inform aviation, someone within the industry needs to be in the “flight deck” navigating the research.