Two Switzerland-based companies are preparing to launch a new vision-based guidance system for unmanned aerial vehicles (UAVs) that they said will mark a breakthrough in the use of artificial intelligence (AI) in autonomous aircraft operations. The new Magpie system jointly developed by Daedalean and UAVenture provides features such as safe landing advisories and visual navigation to sustain operations during GPS outages.
Magpie will soon be available to commercial UAV operators using UAVenture’s existing AirRails flight control system. Daedalean is also using the technology in its planned AI-based autopilot, intended to support autonomous operations of both new-generation eVTOL aircraft and existing rotorcraft and fixed-wing aircraft. The company intends to add computer vision functions to the autopilot, which it says will eventually become the first certifiable autopilot supporting autonomous aircraft operations at EUROCAE's highest design assurance level, DAL-A.
According to Daedalean, the Magpie system “is the ideal demonstrator of the safety-certifiable neural networks that process visual data, enabling the features that nowadays require the human pilot’s eyes and visual cortex.” The existing AirRails system relies on laser- and radar-based distance sensors for accurate landing or terrain following. The partners said this will be enhanced by adding visual sensors, which will bring benefits such as avoiding the need to pre-mark landing spots and the ability to operate in a natural environment and recognize dynamic obstacles on the ground.
Magpie, which has been in development and flight testing since February 2018, will provide real-time vision-based detection of emergency landing locations. It will also deliver vision-based navigation and attitude estimation in situations when GPS guidance is not available. The hardware weighs less than 500 grams (1.1 pounds).
Separately, Daedalean has selected the Unigine 3D Engine virtual environment simulation platform to support development of its AI autopilot. It will use the system to train neural networks to perform the visual-cortex functions needed for the autopilot. According to Daedalean CEO Luuk van Dijk, this process would otherwise have required an infeasible number of real-world flights to train the neural networks to handle every conceivable scenario.