Terrain awareness and warning system (TAWS) technology, which has been credited with preventing several potentially major accidents, underscores the need for continued flight-operations vigilance, especially during the approach and landing phases, according to safety consultant Capt. Dan Gurney.
“Industry [must] maintain focus on the problems of human error, particularly situations that have potential for error or contain threats that were not identified or were mismanaged. [We still have] much to learn (or remember) about how to identify and counter latent threats,” the former UK Royal Aircraft Establishment and British Aerospace Regional Aircraft test pilot told the Flight Safety Foundation European safety seminar.
Large airplanes began to carry ground-proximity warning systems in 1974, and since then the number of controlled flight into terrain (CFIT) events has declined significantly, said Gurney, who analyzed six sample incidents in which TAWS prevented potentially fatal CFIT accidents.
Ground proximity warning systems (GPWS) were extended to commuter operations in 2000. But that technology detects only the terrain immediately below, rather than ahead of, the aircraft; it cannot sense a sharp rise in the terrain ahead until it is too late to avoid it.
Further, to permit landings free of unwarranted warnings, the GPWS detection logic must be desensitized in the landing configuration, so the system cannot issue a warning if ground clearance becomes insufficient on approach.
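The limitation described here can be illustrated with a toy model of classic GPWS alerting. The function, mode names and thresholds below are simplified illustrations for this article only, not certified alerting logic:

```python
from typing import Optional

def gpws_warning(radio_altitude_ft: float,
                 gear_down: bool,
                 flaps_landing: bool) -> Optional[str]:
    """Return a warning string, or None if no alert applies.

    Classic GPWS senses only the terrain directly below the aircraft
    (via the radio altimeter). To avoid nuisance alerts on every normal
    landing, the logic is desensitized once the aircraft is in the
    landing configuration -- which is why it cannot protect against an
    unintentionally low but otherwise normally configured approach.
    """
    if not gear_down and radio_altitude_ft < 500:
        return "TOO LOW - GEAR"
    if gear_down and not flaps_landing and radio_altitude_ft < 200:
        return "TOO LOW - FLAPS"
    # Landing configuration: terrain-clearance alerting largely inhibited.
    return None
```

The sketch shows why a fully configured aircraft descending below a safe height generates no alert: the configuration checks that suppress nuisance warnings also suppress the genuine one.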
Enhanced GPWS (or TAWS) combines a digital terrain database with accurate navigation equipment to warn of a perceived discrepancy between aircraft navigation position and the ground almost down to the runway threshold. It advises if steeply rising ground is detected ahead. Now, warnings of obstacles as well as terrain are available, and the equipment is mandatory for all turbine-powered aircraft with six or more seats.
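The look-ahead principle can be sketched in a few lines. The terrain profile, clearance threshold and function below are hypothetical simplifications of how a real EGPWS compares the projected flight path with its gridded terrain database and certified alerting envelopes:

```python
def taws_look_ahead(altitude_ft: float,
                    vertical_speed_fpm: float,
                    ground_speed_kt: float,
                    terrain_profile_ft: list,
                    min_clearance_ft: float = 300.0) -> bool:
    """Return True if the projected path comes too close to terrain ahead.

    terrain_profile_ft is a toy terrain "database": elevations sampled
    at 1 nm intervals along the projected track.
    """
    for i, terrain_ft in enumerate(terrain_profile_ft, start=1):
        # Time (minutes) to reach the i-th sample, 1 nm per sample.
        minutes_to_point = (i * 60.0) / ground_speed_kt
        projected_alt = altitude_ft + vertical_speed_fpm * minutes_to_point
        if projected_alt - terrain_ft < min_clearance_ft:
            return True  # e.g. "CAUTION TERRAIN" / "TERRAIN, PULL UP"
    return False
```

For example, an aircraft at 3,000 ft descending at 1,000 fpm at 120 kt toward rising terrain of 1,000, 1,500 and 2,500 ft would trigger the check at the third sample, whereas the same aircraft in level flight would not.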
Since EGPWS/TAWS entered airline and corporate-aviation operations in the past five years, there has been no CFIT accident involving an equipped aircraft, said Gurney. Nevertheless, there have been a number of potential mishaps, including serious incidents requiring formal investigations (for which published reports are expected). Other incidents have been studied by operators or manufacturers trying to understand what prevented crews from detecting the circumstances that triggered TAWS warnings.
Gurney’s analysis of six incidents since March 2003 involving premature descents for landing has found many threats and errors that pilots may encounter routinely. “Most crews on most days will manage these and avoid error-prone behavior, but [circumstances] arise where threats and opportunity for error overcome human capability and a technological solution is required. Ultimately, the crew [is] responsible for managing threats and avoiding error before and after the [TAWS] alert,” said Gurney.
The incidents involved two kinds of threat: pre-existing conditions not posing independent risk (such as airport susceptibility to “black-hole” illusion; charts without altitude/range tables; ambiguous procedures; unsatisfactory chart layout, scaling or format; non-precision approaches; and offset distance-measuring equipment) and conditions arising from variable situations (such as darkness; instrument flight rules; late plan change; or failure to react to alerts or warnings).
Improving Risk Assessment
Crew vigilance or audit management would have identified all the pre-existing conditions, said Gurney. Risk assessments should consider other likely conditions that could increase risk. For example, flying a non-precision approach (NPA) in conjunction with weak charts is “a particularly high risk.”
Gurney suggested that the errors made in the events he analyzed originated from circumstantial conditions, or from unidentified or mismanaged threats. Crews did not understand situations, or they chose incorrect courses of action. Misunderstanding included visual illusion or misidentification of visual cues, misinterpretation of procedures, failure to build or share mental models, and mental-map slips. “These errors originate in the cognitive (thinking) processes – what we think about, how and on what we focus, and why we think that something is important,” said Gurney.
Choosing wrong courses of action often involved simple mistakes or memory lapses, perhaps through lax training or poor discipline. “They originate from weaknesses in cognitive control – the way in which we control thinking: self-discipline, double-checking, managing time, avoiding preconceptions and not rushing to conclusions.”
The errors should have been detected with self- or cross-crew monitoring. “It is essential that individuals self-debrief to clarify their understanding of any error, the situational circumstances and threats, or the behavior that may have led to the error.”
Monitoring, which had failed in every incident he analyzed, must be accurately defined, trained and practiced to enable skillful application, according to Gurney. “Monitoring must be truly independent, [which] starts with the approach briefing. Each pilot monitors by crosschecking the [chart] details and his understanding of the plan for the approach. Briefings–flight plans for the mind–provide a pattern for subsequent comparisons. The crew needs a shared mental model [that must] be the correct model for the situation.”
Because people tend to build internal models (patterns) of how things should be, Gurney advised that crews must guard against short-term tactical thinking, in which response to expectations often dominates the sounder assessment and judgment of strategic thought. “[They should] make an earlier consideration of what a situation could be, consider options and alternatives, and if in doubt ask. In [every] incident, the crew lost awareness of position relative to the runway in altitude, distance and time.”
Since the objective was to land safely, crews must maintain situational awareness of the runway location and continually update their mental model. They should use available physical tools, said Gurney: “Display runway position on the EFIS [screen], pay attention to vertical displays, and select terrain maps for all approaches as well as [...]”
Gurney emphasized the need to react to alert messages. “TAWS warnings require action without thought. To gain this skill, crews need to practice pull-up techniques in response to a TAWS warning, [preferably] in surprising, stressful training situations. Use a ‘glass mountain’ terrain model during simulator training.”
In debriefing, if crews argue that “they ‘knew where they were’ and there was no terrain threat,” they must recognize that this was “exactly the erroneous mindset that all the incident crews may have held. They were convinced that they knew where they were and ‘It was the TAWS warning that was wrong,’ not them. It is essential that training overcomes the desire to understand before acting; a pull-up must be flown without hesitation,” he explained.
Gurney pointed out that in the analyzed incidents, the industry was fortunate to maintain its good safety record. Every event involved an aircraft with a modern “glass” cockpit equipped to enhance situational awareness, yet each had been exposed to terrain hazards. In most cases, crews were apparently unaware of their position. In two, the aircraft were at very low altitude yet still 1.5 nm from the runway. The single incident that triggered an obstacle warning involved the operator’s only aircraft with ‘obstacle mode’ activated.
All identified potential threat conditions “must be reported, removed, avoided or any residual effects countered. Crews [must] recognize situational threats because they are the last line of defense,” according to Gurney. “Luck could be defined as having safety defenses that just matched the hazard or risk. However, [because each] incident involved the crew pulling up following a warning, this definition of luck is unacceptable. We cannot expect the last line of defense always to hold.”
Active threat and error management, at all management and operational levels, requires constant vigilance, risk assessment and timely decisions to select corrective courses of action, concluded Gurney. “These processes depend on critical thinking skills–the foundations of airmanship, leadership and professional management.”