Researchers at Southern Methodist University (SMU) are developing an innovative approach that combines biometrics with machine learning to reshape the future of flight training. The goal is to measure a pilot's physical reactions and provide, in real time, a more objective and automated assessment of performance, making flight training more personalized, effective, and efficient. Flight training programs have historically relied on an instructor's subjective observations and post-flight analysis to determine proficiency and mastery of a maneuver.
Teamed with simulator manufacturer and training provider CAE, researchers from SMU’s AT&T Center for Virtualization are entering the fourth year of a project to develop and test methods to measure situational awareness and cognitive load sensing using biometrics and machine learning. The goal is to capture how pilots react to various scenarios in a flight simulator.
Machine learning is the study of algorithms that improve automatically through experience and the use of data; it is considered a branch of artificial intelligence. The SMU project measures multiple physical reactions, such as visual gaze patterns, pupil size, and heart rate, to determine a pilot's level of engagement, workload, situational awareness, stress, or fatigue. Notably, some of the early automated biometric test results align closely with the assessments of highly experienced human evaluators.
“Our theory is that biometrics during the simulation will result in much more objective and accurate measurements than asking users a few questions after the simulation to measure their experience,” said Suku Nair, director of the Center for Virtualization at SMU.
Eric Larson, the principal investigator on the study, is an associate professor of computer science at SMU and a recognized expert in machine learning with more than 53 patents and published papers. “Accelerating learning with biometric sensing is a difficult, unproven hypothesis,” he said. “This research seeks to understand how sensing can be used to understand a learner’s mastery level in a difficult task, like flying an aircraft. We hope to advance the research field by being the first group to show whether personalized, automated learning can show efficacy in an actual learning scenario.”
Much of the early interest in the project was in support of a Department of Defense priority to automatically measure mission-critical, higher-order cognitive constructs, such as situational awareness, to accelerate training of complex skills and support multi-domain warfare. In 2019, SMU and CAE (then L3Harris Technologies) first demonstrated that machine learning based on biometric data could yield accurate, real-time performance results. During those demonstrations, pilot eye-scan techniques, visualized as heat maps, were used to determine levels of situational awareness and mental workload.
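At its core, an eye-scan heat map bins gaze samples over regions of the display to show where a pilot's attention dwells. The following minimal sketch illustrates the idea with invented gaze coordinates and a hypothetical `gaze_heatmap` helper; the SMU/CAE system is far more sophisticated.

```python
# Illustrative sketch: binning eye-tracking samples into a coarse heat map.
# The gaze points and 4x4 grid are hypothetical, not the SMU/CAE setup.

def gaze_heatmap(points, grid=4):
    """Count normalized (x, y) gaze samples, each in [0, 1), per grid cell."""
    heat = [[0] * grid for _ in range(grid)]
    for x, y in points:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        heat[row][col] += 1
    return heat

# Example: a pilot fixating mostly on the upper-left instrument cluster.
samples = [(0.1, 0.1), (0.15, 0.12), (0.2, 0.05), (0.8, 0.9)]
heat = gaze_heatmap(samples)
```

Cells with high counts mark dwell areas; comparing a student's dwell pattern against an expert's for the same phase of flight is one way such maps can feed a situational-awareness estimate.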
CAE was drawn to SMU for its expertise in biometric sensing and machine learning. “Our research will yield the first real-time measure of situational awareness, a critical high-order cognitive construct for dynamic, high-stakes domains such as military aviation,” said Sandro Scielzo, a principal human systems scientist at CAE. “For example, our machine learning classifiers could identify a breakdown in perception, allowing the remediation of poor visual scans. A breakdown in comprehension could also be mitigated by ensuring students remain within the zone of maximal adaptability via real-time training complexity adaptation. Thus, mission readiness could be achieved more effectively and rapidly.”
To date, much of the data-collection effort has consisted of a repeated-measures experiment with 40 test subjects of varying backgrounds and experience levels flying a mixed-reality flight simulator in a controlled environment. The simulator uses virtual reality (VR) to replicate a military fighter jet and incorporates visualization, head-up displays, and high-precision hand tracking.
Data collection equipment included a VR headset with an integral eye tracker and a wrist-worn wearable device to collect other biometric parameters. The eye-tracking system collected gaze patterns, pupillary response, and eye blinks. The wrist device collected heart rate, galvanic skin response (also known as electrodermal activity, or EDA), and wrist acceleration. These biometric data points are then correlated through computer analysis to determine levels of cognitive load or mental effort (workload), stimulation, or stress.
A lot can be learned from this biometric data. For example, a “poor” gaze pattern, depending on the phase of flight, can indicate a high workload, while a “correct” gaze pattern suggests a higher level of attention and performance. Likewise, fewer eye blinks, or blinks of shorter duration, can be correlated with tasks requiring greater attention, and heart rate and heart rate variability can be used to track effort during a mental task. In addition to examining single biometric parameters, the SMU researchers performed comparative analysis to determine whether combinations of parameters correlated more strongly than any one measure alone.
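The comparative analysis described above boils down to measuring how strongly two biometric signals move together. A minimal sketch, using a hand-rolled Pearson correlation and invented per-maneuver values (real features would come from the eye tracker and wrist sensor):

```python
# Illustrative sketch: Pearson correlation between two biometric series.
# The sample values below are invented for demonstration only.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-maneuver features: blink rate tends to fall as
# heart rate (effort) rises, giving a strong negative correlation.
heart_rate = [72, 80, 88, 95, 103]
blink_rate = [22, 19, 15, 12, 9]
r = pearson(heart_rate, blink_rate)
```

A value of `r` near -1 would indicate that the two signals track the same underlying workload in opposite directions, which is the kind of cross-parameter relationship the comparative analysis looks for.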
Key to this research is the ability to measure pilot workload and whether the pilot has excess capacity to perform additional tasks. Using biometrics and machine learning algorithms, researchers could determine whether a subject was “loaded” or “unloaded.” Likewise, the project creates an automated means of objectively evaluating a student’s performance and the level of ease with which a maneuver is flown.
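The loaded/unloaded determination is, in essence, a binary classification over biometric features. As a toy sketch, a nearest-centroid rule over two invented features (pupil dilation and heart rate) captures the idea; the actual SMU study uses richer inputs and more capable machine learning models.

```python
# Illustrative sketch: a minimal "loaded" vs. "unloaded" classifier.
# Features and training values are hypothetical. A real system would
# normalize feature scales and use many more biometric inputs.

def centroid(rows):
    """Mean of each feature column across the training windows."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def classify(sample, loaded, unloaded):
    """Nearest-centroid rule over (pupil_dilation_mm, heart_rate_bpm)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    c_loaded, c_unloaded = centroid(loaded), centroid(unloaded)
    if dist2(sample, c_loaded) < dist2(sample, c_unloaded):
        return "loaded"
    return "unloaded"

# Hypothetical training windows: [pupil dilation (mm), heart rate (bpm)].
loaded_windows = [[5.1, 98], [5.4, 105], [5.0, 101]]
unloaded_windows = [[3.2, 70], [3.5, 74], [3.0, 68]]

label = classify([5.2, 99], loaded_windows, unloaded_windows)
```

In a training context, a stream of "loaded" classifications during a maneuver could flag that the student has no spare capacity for additional tasks.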
Outside the laboratory, the SMU/CAE team spent time at Edwards Air Force Base to demonstrate the feasibility and utility of the physiological sensor system as a flight-test data source for objectively assessing pilot workload. The test involved two flight-test sorties in a Boeing C-17A jet and included aerial refueling maneuvers and lateral-offset landings. A total of 33 maneuvers were recorded with good results. The study was deemed exploratory, but it supports the hypothesis that a real flight captures a higher workload than the simulator data.
The use of biometrics and machine learning in a flight training environment may ultimately change the way pilots are trained. Physical reactions from a student may be a more reliable indicator of proficiency and mastery of a maneuver than the subjective view of an instructor or evaluator. While the SMU/CAE research validates the use of biometrics and machine learning for military aircraft, these processes and techniques may also be useful in civil aviation. The use of gaze patterns, for example, may help train pilots to improve in areas such as active flight path monitoring, flight mode awareness, and the effectiveness of pilot monitoring duties.