AINsight: Adaptations, Groupthink Down Twin Otter

December 22, 2022, 12:40 PM
A de Havilland DHC-6-300 Twin Otter crew was forced to make an off-airport landing in muskeg 6.7 nm northwest of Fort Providence Aerodrome on Nov. 1, 2021. The Transportation Safety Board of Canada drills down into group dynamics to find the root cause of the fuel exhaustion that led to the crash. (Photo: Transportation Safety Board of Canada)

In November 2021, a Canadian-registered de Havilland DHC-6-300 Twin Otter operating as a scheduled flight ran out of fuel and was forced to make an off-airport landing. The final accident report was just released and did not identify any mechanical issues with the aircraft, medical or physiological concerns with the flight crew, or any problems related to the weather. The pilots simply failed to check the fuel quantity prior to takeoff.

The Twin Otter was substantially damaged in the crash. The two pilots and three passengers survived and were rescued four hours later; each was treated for minor hypothermia after exposure to freezing temperatures.

Through its analysis, the Transportation Safety Board (TSB) of Canada determined several causes of and contributing factors to the accident. The most interesting findings relate to human factors: the development of adaptations in checklist use, and the ways group dynamics can negatively influence flight safety.

Carrying sufficient fuel to fly from “point A to B and an alternate (if required) plus required fuel reserves” is one of those “simple stupid things” that pilots learn during flight training.

For this flight, the operator normally planned to fuel the aircraft for a roundtrip plus the required reserves. In this case, a total of 2,500 pounds of fuel was required for the flight from Yellowknife Airport (CYZF) to Fort Simpson Airport (CYFS) in the Northwest Territories of Canada.
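As a rough illustration of roundtrip fuel planning (not the operator's actual method), the required load is trip fuel for each leg plus the mandated reserve. The per-leg figure below is hypothetical, chosen only so that two legs plus the roughly 450-pound Twin Otter reserve cited later in the report total the 2,500 pounds planned for this flight.

```python
# Hypothetical fuel-planning sketch for a roundtrip flight.
# The per-leg trip fuel (1,025 lb) is an assumed figure chosen so that
# two legs plus a ~450 lb 45-minute reserve total the 2,500 lb planned
# for this flight; the operator's real breakdown is not in the report.

def required_fuel_lb(trip_fuel_per_leg_lb: int, legs: int, reserve_lb: int) -> int:
    """Total fuel required: trip fuel for every leg plus the reserve."""
    return trip_fuel_per_leg_lb * legs + reserve_lb

print(required_fuel_lb(1025, legs=2, reserve_lb=450))  # 2500
```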

Commercial operators normally have “checks and balances” to ensure that each flight departs with sufficient fuel in the tanks. At this airline, the flight crew would normally request fuel through the company’s flight coordinator, who would then arrange the fuel load with the fueling vendors. On the day of the crash, the pilots likely did not request fuel from the flight coordinator.

During preflight, as the captain entered the aircraft, he observed a fuel receipt in the door map pocket of the aircraft, which reinforced his belief that the aircraft had been fueled. He did not read the fuel receipt—it was from a flight three days before the accident flight. 

Checklist Adaptations

The last line of defense to ensure that there is sufficient fuel on board is the set of cockpit checklist items that compare the fuel required with the actual fuel on board before departure and during flight. In this case, there were at least three opportunities to cross-check the fuel on board.

During the before-start checklist, the proper response from the captain should have been “Fuel Quantity Checked—2,500 pounds.” Instead, the captain performed the checklist from memory and the fuel check was missed due to an interruption.

Next, during the taxi checklist—a challenge and response checklist—the first officer should have announced “Fuel System” and the appropriate response from the captain would confirm the proper status of the fuel system (“Normal, Tips On/Off, No Lights”) and restate the fuel quantity on board the aircraft (“XXX pounds.”).

Again, the captain performed the checklist from memory without any involvement from the first officer—no challenge or response. At this point, there was less than 600 pounds of fuel on board the aircraft, nearly 2,000 pounds less than the planned amount.

Once level in cruise, the pilot monitoring (PM)—the first officer in this case—was required to complete the cruise checklist: he should have read “Fuel,” confirmed the fuel system status (“Normal, Tips On”), and stated the quantity of fuel expected at the destination (“Landing with XXX pounds”). Instead, the first officer performed the items silently, from memory, without reference to the checklist. As a result, the low fuel state of the aircraft was never identified.

The cruise checklist was the last opportunity for the flight crew to note the actual fuel on board, calculate the amount of fuel required to complete the flight, and assess the fuel that would remain upon landing at the destination. The minimum fuel remaining at the destination must be equal to or greater than the required 45-minute reserve; for a Twin Otter, that is approximately 450 pounds.

The Crash

According to the fuel burn analysis section of the accident report, the low-fuel-level caution light for the aft fuel tank illuminated 26 minutes after takeoff—the crew did not notice it—and at this point, there was 402 pounds of fuel on board, which equates to just 40 minutes of flying time at normal cruise power settings.
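The report's endurance figure can be checked with simple arithmetic: 402 pounds at a normal-cruise burn of roughly 600 pounds per hour (the rate implied by the report's 40-minute figure, and consistent with a 450-pound, 45-minute reserve) works out to about 40 minutes. A minimal sketch:

```python
# Endurance check using figures from the TSB report: 402 lb of fuel
# remaining, and an assumed normal-cruise burn of ~600 lb/hr (the rate
# implied by the report's "40 minutes of flying time" figure).

def endurance_minutes(fuel_on_board_lb: float, burn_rate_lb_per_hr: float) -> float:
    """Remaining flight time in minutes at a constant fuel burn rate."""
    return fuel_on_board_lb / burn_rate_lb_per_hr * 60

print(round(endurance_minutes(402, 600)))  # about 40 minutes
```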

Approximately 13 minutes later, the flight crew noticed the low-fuel-level caution light and realized they had departed with insufficient fuel. At this point, the aircraft was level at 6,500 feet and roughly halfway between Yellowknife and Fort Simpson with not enough fuel to continue to the destination or return to the departure point. 

Almost immediately upon realizing the low fuel state of the aircraft, the flight crew began a diversion to Fort Providence Aerodrome (CYJP), which had the closest available runway.

To conserve fuel, the captain climbed the aircraft to 7,000 feet and reduced power. Next, the captain informed the company’s flight coordinator of the situation via the aircraft’s satellite radio. In a further attempt to conserve fuel, the captain intentionally shut down the left engine. Power was then reduced on the right engine and the captain began a slow descent.

Approximately nine minutes later, as the aircraft descended through 3,300 feet some 11 nm from CYJP, the right engine began to surge due to fuel exhaustion. The flight crew elected to shut down the remaining engine, feathered the propeller, and established best-glide airspeed. Four minutes later, the aircraft touched down on the muskeg—a bog or peatland—6.7 nm northwest of CYJP.


In its report, the TSB points out that “flight crew checklists are designed to ensure pilots complete a list of tasks in an appropriate order, without omission.” Specifically, checklists reduce the probability of any omissions since they remind the pilot of each required step in the required order and increase the probability of detecting any omissions because when they are read aloud, the other pilot can hear if a step is missed.

The TSB cautions that “to be effective, the operating culture and CRM must encourage and maintain the practice of using checklists routinely.” Improper use of a checklist, or failure to use one at all, is frequently cited as a causal or contributing factor in accidents. In these cases, a critical step is missed due to an interruption or distraction, or the pilot has intentionally omitted a checklist because of an adaptation or “workaround” to the required cockpit routine.

In aviation, policies and operating procedures are established to set boundaries for safe operations. According to the TSB, individuals may experiment with those boundaries to become more productive or to obtain some other benefit. 

As an example, a pilot may shift focus from the safety of flight (threat-oriented) to the completion of the flight (goal-oriented). Following this shift, the risks associated with the flight may no longer remain as low as reasonably practicable: risk-taking behavior may increase, resulting in adaptations to policies and procedures and, over time, unsafe practices.

In the example of performing a checklist from memory, if there are no negative repercussions then the individual will continue taking risks. Over time, they may view this as a success or a reward (the goal is accomplished) and become desensitized or habituated to the level of risk taken. This influences future risk-taking behavior.

During this investigation, the TSB found that a few experienced Twin Otter captains at the airline—many of whom had previously been first officers in the type—had developed the unsafe practice of performing checklists silently or from memory only.

According to the report, several first officers suggested that this had become the routine for those captains. A few junior first officers, including the occurrence first officer, noted that these captains would often perform checklists silently or by memory only without input from other crewmembers.

The first officers would passively accommodate this practice without objecting or filing any safety reports; thus, there were no negative peer-to-peer repercussions for the noncompliant captains, and the company had no awareness of the issue. The report noted that a group of first officers who disagreed with the practice would only discuss it among themselves or confide in some of the training captains.

For this group of captains, the practice of not using checklists properly became the new norm, although the TSB report said they would be compliant during check rides. This practice affected the rest of the pilot group.

The TSB noted, “Individuals are often unaware of how or when they have been influenced by other people. As a result, they may end up making decisions or changing their behavior in a way they would not normally choose to do.” It continued, “Whether someone is influenced by another individual or group of individuals depends on many factors, such as experience, seniority, personality, social status, or motivation.”

Likewise, the TSB in its report identified two types of influence: normative and informative. Normative influence is driven by the expectations of others: an individual adopts a decision or behavior because they perceive it to be expected, the socially preferred norm. The individual may not completely believe in the decision or behavior, but if the rest of the group does, the individual may follow.

Conversely, informative influence is driven by information: the individual’s opinion has genuinely changed, and they now believe in the decision or behavior.

The TSB added specific examples of influence to include compliance, conformity, and groupthink.

Compliance occurs when an individual performs a task (or refrains from one) simply because someone has requested it. The probability of compliance is often influenced by previous requests and by the status (for example, seniority or rank) of the requestor. Obedience differs in that the individual is following a direct order.

Conformity is when an individual gradually changes their behavior to make it more in line with the group norm. The individual is aware of the behavior and attitude of the remaining group. 

Groupthink is when the motivation of individuals in a group to maintain “group consensus” overrides their motivation to evaluate all potential courses of action.

In its analysis, the TSB could have settled for simple causation: turbine-powered aircraft need fuel to function, and pilots should simply use the checklist. But the Board instead chose to dig deeper into the reasons why individuals or groups of individuals make a conscious decision not to follow prescribed procedures.

Procedural drift and procedural noncompliance are convenient “canned” terms that describe the next level of analysis. But for air safety investigators, getting into the “why”—adaptation, group dynamics, and influence—provides the industry with a better understanding of how an air-taxi aircraft, with paying passengers aboard, can run out of fuel well short of its intended destination.

The opinions expressed in this column are those of the author and not necessarily endorsed by AIN Media Group.