Using fly-based signal processing to listen for drone aircraft
Small uncrewed aircraft are growing more common across all aviation sectors, and detecting them is increasingly important. One of the most common ways to detect unknown aircraft is acoustic: listening with a microphone or an array of microphones. However, uncrewed craft are generally small and correspondingly quiet, so even this is not an easy task. In addition, at long range the sound of the aircraft mixes with background noise and becomes difficult to distinguish.
Fang et al. developed a method to acoustically detect small aircraft using an algorithm derived from the vision-processing system of the hoverfly.
“We applied the principles of the hoverfly’s vision system to detect very, very quiet targets, in this case, drones, at very long ranges,” said author Anthony Finn.
The hoverfly’s vision system is ideal for a handful of reasons. It is simple to study and understand, and, like those of most insects, it is pixel based. Most importantly, the hoverfly’s vision system excels at picking faint signals out of noise. The team translated it into a digital algorithm, which they had previously used to identify visual targets.
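The paper’s specific model is not spelled out above, but one widely used ingredient of insect-inspired early vision is per-pixel temporal adaptation, in which each pixel divisively normalizes its input against a running estimate of local brightness, compressing steady backgrounds while amplifying small changes. Below is a minimal sketch of that idea in Python, assuming a first-order low-pass filter as the adaptation state; the function name and time constant are illustrative, not taken from the paper.

```python
import numpy as np

def adaptive_pixels(frames, dt=0.001, tau=0.05, eps=1e-6):
    """Per-pixel divisive adaptation over a stack of frames.

    frames : array of shape (time, height, width)
    Returns a stack of the same shape in which slowly varying
    background is suppressed and small fluctuations are enhanced.
    """
    alpha = dt / (tau + dt)          # low-pass mixing coefficient
    state = frames[0].astype(float)  # running brightness estimate
    out = np.empty_like(frames, dtype=float)
    for i, frame in enumerate(frames):
        state += alpha * (frame - state)      # update adaptation state
        out[i] = frame / (state + eps) - 1.0  # divisive normalization
    return out

# Example: 100 frames of noise riding on a slowly brightening background
frames = np.random.rand(100, 32, 32) + np.linspace(0, 5, 100)[:, None, None]
enhanced = adaptive_pixels(frames)
```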
In this case, the researchers adapted their algorithm to identify acoustic signals by applying a frequency transform to the recorded sound and arranging the result into a two-dimensional image. They then processed that image with their algorithm to detect distant drone aircraft. Their system was able to identify objects up to 50% farther away than traditional methods.
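The authors’ implementation is not reproduced here, but the pipeline described above (frequency transform, two-dimensional image, image-based enhancement) can be sketched. This Python sketch assumes a short-time Fourier transform as the frequency transform and substitutes a simple narrowband-line detector for the bio-inspired stage; the function names, window length, and threshold are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.signal import stft

def sound_to_image(samples, fs, nperseg=1024):
    """Short-time Fourier transform of a mono signal, returned as a
    2D log-magnitude image (frequency bins x time frames)."""
    _, _, Z = stft(samples, fs=fs, nperseg=nperseg)
    return 20 * np.log10(np.abs(Z) + 1e-12)

def narrowband_lines(image, db_thresh=6.0, k=31):
    """Flag frequency bins whose time-averaged level stands out
    against a smoothed spectral background (a crude stand-in for
    the paper's bio-inspired enhancement stage)."""
    profile = image.mean(axis=1)                 # mean dB level per bin
    padded = np.pad(profile, k // 2, mode="edge")
    background = np.convolve(padded, np.ones(k) / k, mode="valid")
    return profile - background > db_thresh

# Demo: a quiet 400 Hz tone buried in much louder white noise.
fs = 16_000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
audio = 0.2 * np.sin(2 * np.pi * 400 * t) + rng.normal(0.0, 1.0, fs)

image = sound_to_image(audio, fs)
hot = narrowband_lines(image)
print("detected lines near:", np.fft.rfftfreq(1024, 1 / fs)[hot], "Hz")
```

Averaging each frequency bin over time is what makes the faint tone recoverable: the noise averages down while the steady rotor line does not, which is the same intuition behind treating the spectrogram as an image with structure worth enhancing.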
The researchers are now exploring practical applications of this technology.
“We would like to be able to transition this research into the real world,” said Finn. “We are trying very hard to do that.”
Source: “Acoustic detection of unmanned aerial vehicles using biologically inspired vision processing,” by Jian Fang, Anthony Finn, Ron Wyber, and Russell S. A. Brinkworth, The Journal of the Acoustical Society of America (2022). The article can be accessed at https://doi.org/10.1121/10.0009350.