AIN Blog: NASA Advances Display for Airport Surface Operations

 - April 11, 2012, 10:00 AM
NASA Langley head-worn display
NASA Langley Research Center engineer Kevin Shelton models a head-mounted display system that would give pilots a virtual view of the airport surface. (Photo: NASA Langley/Sean Smith)

Lightweight, head-worn displays designed for military aircraft might eventually find a home in commercial cockpits as well. Researchers at NASA Langley Research Center in Hampton, Va., have developed an eyewear clip-on display for use by pilots during airport surface operations, the subject of a recent solicitation to industry for possible commercialization. The miniature display, with an associated head tracker and speech-recognition component, has thus far been tested only in a lab environment, but could be flight-tested by next year, says Trey Arthur, a NASA research electronics engineer.

The NASA system received a patent in 2010 as a “multi-modal cockpit interface for improved airport surface operations.” Aimed at improving upon the situational awareness afforded by head-up displays (HUDs), which have a limited field of view, the system consists of a head-tracking device, a processing element and a head-worn display. The processing element receives head position information from the head tracker and the aircraft’s current location from GPS or an inertial navigation system, and presents a real-time virtual airport scene on the display corresponding to the pilot’s head position and aircraft location. The processing element could also interpret voice commands from the pilot using speech-recognition software, for example, to select among display modes or adjust brightness.
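As a rough illustration of the processing element the patent describes, the sketch below (Python; all names and the command vocabulary are hypothetical, not from the patent) combines head-tracker yaw with aircraft heading to derive the world-frame bearing the renderer would draw the virtual airport scene toward, and applies recognized voice commands to display settings.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Orientation reported by the head tracker, relative to the aircraft."""
    yaw_deg: float
    pitch_deg: float

@dataclass
class AircraftState:
    """Position and heading from GPS or an inertial navigation system."""
    lat: float
    lon: float
    heading_deg: float

def view_bearing(pose: HeadPose, state: AircraftState) -> float:
    """Combine aircraft heading and head yaw into the world-frame
    bearing (0-360 deg) the pilot is looking toward; the renderer
    would draw the synthetic airport scene centered on this bearing."""
    return (state.heading_deg + pose.yaw_deg) % 360.0

def apply_voice_command(cmd: str, settings: dict) -> dict:
    """Map a recognized utterance (assumed already normalized by the
    speech recognizer) onto display settings."""
    if cmd == "brightness up":
        settings["brightness"] = min(settings["brightness"] + 0.1, 1.0)
    elif cmd == "mode svs":
        settings["mode"] = "SVS"
    return settings
```

For example, with the aircraft taxiing on heading 090 and the pilot's head turned 30 degrees right, the scene would be rendered toward bearing 120; saying "mode svs" would switch the display to the synthetic vision view.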

Researchers have developed a system using prototypes of a soldier’s monocular display from Rockwell Collins and a head tracker from InterSense. While the team considered different sensor inputs for the display, including automatic dependent surveillance-broadcast (ADS-B), enhanced vision system (EVS) and synthetic vision system (SVS), the effort is now focused on presenting a virtual view through SVS, which would require an onboard terrain database and a precise navigation source. EVS hasn’t been ruled out, says Arthur, but it would require multiple tiled cameras, or a camera that slews with the pilot’s head, to provide a full field of regard.

NASA has researched helmet displays for decades but started working on smaller displays in the last five years, notes Arthur. Five years ago, BAE Systems introduced its miniature Q-Sight monocular display, which approximates the NASA concept. The UK Royal Navy ordered the four-ounce, clip-on display in 2009 as a gunner’s sighting system for the Lynx Mk8 helicopter. The Q-Sight would also factor into BAE’s alternate helmet-mounted display for the F-35 Joint Strike Fighter, a development started after Vision Systems International encountered technical problems with its incumbent helmet-mounted display.

The technology’s next big wave could extend to airliner cockpits, where head-worn displays would enhance situational awareness during taxiing and protect against runway incursions. “The question is, what are the issues with putting something near your eye that gives you HUD-like capabilities. That’s where we’re at in the solution of this,” Arthur concludes. “If you had a display that was on the order of sunglasses, would that be something that a commercial crew would accept?”