Panel examines effect of culture on av safety

 - May 16, 2008, 10:59 AM

If aviation is essentially an Anglo-Saxon industry, and the rest of the world adopts safety measures that Anglo-Saxons devise, are these practices equally effective in other cultures? That was the question posed at a symposium entitled “Culture, Teams and Crew Resource Management,” recently held at Old Dominion University in Norfolk, Va.

“Safety, like beauty, is in the eye of the beholder,” said Daniel Maurino, coordinator of the International Civil Aviation Organization’s flight-safety and human-factors program. He suggested that safety is influenced by hazards and risks, the allocation of resources and the value placed on human life.

“I’m here to tell you that the value of human life is not the same the world over,” Maurino said. “The number of accidents that we are willing to accept before investing resources and money is the key. And that process is highly culturally biased.”

The two-day symposium grew out of research by Donald Davis, associate professor of psychology at Old Dominion, examining the role of culture in shaping how pilots work together in the cockpit.

The FAA and the NTSB accept that human error is responsible for most aircraft accidents. Only recently, however, have authorities begun to take a culture-centered approach to the study of flight-crew performance.

Davis found that culture may influence performance among aircrews, and that definitions of what constitutes good performance may vary. For example, cultures differ widely in the emphasis they place on task achievement versus the preservation of harmony in interpersonal relationships. In some cultures, task performance is less important than the quality of relationships and loyalty to the group’s leader, Davis said.

Aviation accident rates also vary dramatically throughout the world, with rates in emerging countries in Africa, Latin America and Asia more than eight times those of industrialized nations. Some of the variability in accident rates is due to national differences in aviation infrastructure, such as aging aircraft and the availability of navaids. Cultural factors explain additional variation, he said.

Perhaps the classic example is the August 1997 crash of a Korean Air Lines Boeing 747 during a nonprecision approach to Guam International Airport in Agana.

The NTSB said the first officer and flight engineer failed to monitor and question the captain’s performance, a failure it cited as causal to the accident. The board speculated that they may have been uneasy about challenging the captain, or unwilling to do so.

One of the areas explored at a public hearing held by the NTSB was a suggestion that the Korean culture of respecting authority played a part in the accident. Although a KAL official disputed this, he admitted that the airline altered its pilot-training methods to encourage copilots to be more assertive with their captains.

Ray Justinic, an accident investigator for Delta Air Lines, has helped train crews for an unnamed Korean airline in CRM and leadership. What was supposed to be a six-month job has lasted more than three years. “I went there with the knowledge that you could have a no-culture cockpit,” he said. “I no longer believe there is no culture in the cockpit. You have to take in the culture of the country.”

One of the first things he learned was that the Korean culture was “very strong in saving face,” and airline officials blamed a spate of accidents on “foreign captains.” But he said no foreign pilots were involved in two of the airline’s accidents that he reviewed.

He recalled that as a young man working for the Department of Justice, he was assigned to the Caribbean to put together a task force. “Down there I learned that for decision-making, the decision was to make no decision,” Justinic told the group. Later in Latin America, he said he ran into the decision-making process he described as “muy macho. I’ll make all of the decisions. I’ll do it now. I’ll shoot from the hip, but I’m gonna do it.”

A group of Korean pilots that Justinic brought to the U.S. for training on the Boeing 777 proved to be excellent in ground school, oral exams, simulators and actual flying. But he said that when they were given a problem that wasn’t in the manuals, one where they had to make a decision, “they blew it.”

Justinic said that if a Western pilot was put into an abnormal position in the aircraft simulator, the first thing he did was disconnect the autopilot and try to recover. “The particular group I was working with, if you put them in an abnormal situation in the simulator, they engaged the autopilot,” he said. “The comfort factor was the automation. They had to follow the plan.”

Justinic said he also attempted to convince crewmembers that it was permissible to ask questions of their captain without showing insubordination, a technique that seemed to win favor among younger, mostly university-educated crewmembers. And he said his steering committee members for CRM/leadership programs were all from the company.

“I’ve changed from no culture in the cockpit to you have to have culture in the cockpit,” he said. “You can’t take the United Airlines matrix from 1980 and throw it over to Europe, throw it over to South America or throw it into Asia and say, ‘Here’s the perfect program.’” Justinic asserted that the basic philosophies are sound, but each program has to be designed for the local culture.

ICAO’s Maurino said there has been an evolution in safety thinking about the human contribution to accidents. In the 1950s it was generally thought that “bad outcomes” were caused by the actions of individuals. In the late 1980s and early 1990s the model shifted, attributing accidents to failures within organizations.

“Starting in the mid-1990s, we moved on to think that accidents are caused by failures of cognitive compromises,” Maurino said, “largely through the brilliant work by people in Europe and North America, although the work was finished by the Russians.”

The most significant cognitive compromise that operational individuals have to achieve, he said, is the balance between production and protection. “According to traditional aviation knowledge, safety is first, right?” Maurino said. “Well, heck no. Dollars are first. The production system was not created to produce safety.”

He said that in the real world, people have to manage this compromise between the production goals of the organization and safety every day. “I would submit to you that the real expert in aviation is not the captain with 15,000 hours of flying time,” Maurino said. “The real experts are those captains who develop the knowledge and skills to successfully master the compromise between production and protection.”

Aviation is “extremely successful” in making these compromises most of the time, and on the few occasions it fails, the failure is termed human error. But trying to help people master these compromises through normal teaching methods is “useless, probably, because the real world out there is not black and white,” said the former South American airline pilot.

“What I’m trying to say is that when we at ICAO, or the FAA or any other regulatory agency, sit down and think up a system, that system is a perfectly straight line,” Maurino said. “When we deploy the system in the real world, the line becomes wavy. And when the system goes into crisis, the line becomes chaotic. That is where experts make a living.”

He called on the aviation community to “capture normal practice” through observation in research and training settings, such as simulators. “If we want materials that will inform the real world, we need to observe normal practice,” Maurino said.

When pilots encounter situations at decision height that regulators may not have anticipated, the challenge is to “capture successful human performance,” Maurino said. “Then we will be in a position to tell people what to do after they make errors.”