Eyes in the Sky: Eagles Use Drones to Improve Tactical Response Against Active Shooters
Researchers at Embry-Riddle Aeronautical University are investigating the use of drones in active shooter scenarios, to help law enforcement more quickly identify assailants and their weapons.
“Our recommendations may assist public safety organizations looking to improve their best practices to minimize response times and potentially save the lives of victims,” said Dr. Joe Cerreta, associate professor of Aeronautical Science. He added that the research shows that near-infrared (NIR) drone-mounted sensors are more effective at detecting weapons than traditional red-green-blue (RGB) sensors.
The research, a collaboration involving Embry-Riddle’s Worldwide and Daytona Beach campuses, was funded by a Faculty Innovative Research in Science and Technology (FIRST) grant and will be published in The Police Journal: Theory, Practice and Principles.
According to the research abstract, 90% of mass shootings were over before law enforcement arrived at the scene. Furthermore, first responders were limited to a “surround and contain” response until a SWAT team arrived.
The researchers mounted camera sensors onto unmanned aircraft systems, or drones, such as the DJI Mavic 2 Enterprise.
“Using unmanned aircraft systems (UAS) to detect which individual was the threat and the type of weapons they used can provide useful information to increase the speed of the response for first-on-scene, rather than waiting for SWAT,” according to the abstract.
To conduct the research, 48 images of a simulated active threat were collected from different distances using NIR and RGB cameras mounted on drones. Most of the images showed groups of people with one or more of them holding a weapon; some contained no weapon at all. The images were presented to 102 participants, who were asked to point out any weapons they detected.
According to the results, a UAS with an NIR camera sensor yielded a 12% higher rate of accurate weapon detection overall than one with an RGB sensor; the weapons included a pistol, shotgun, rifle, knife and shovel. For the pistol, the detection rate in NIR images was 33% better than in RGB. The results also indicated that images gathered from 25 feet away showed a 42% improvement in participants' ability to determine the weapon type over images gathered from 100 feet away.
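For readers curious about the arithmetic behind comparisons like "12% better," the short Python sketch below is a minimal illustration, not code or data from the study: it assumes hypothetical counts of correct identifications for each sensor type and computes each condition's detection rate and the relative improvement of one condition over the other.

# Minimal sketch of the detection-rate arithmetic behind comparisons such as
# "NIR was 12% better than RGB." All counts below are hypothetical
# placeholders, not data from the Embry-Riddle study.

def detection_rate(correct: int, total: int) -> float:
    """Fraction of trials in which participants correctly identified the weapon."""
    return correct / total

def relative_improvement(rate_a: float, rate_b: float) -> float:
    """Percentage by which rate_a exceeds rate_b."""
    return (rate_a - rate_b) / rate_b * 100

# Hypothetical participant responses: (correct identifications, total trials)
nir_correct, nir_total = 80, 100
rgb_correct, rgb_total = 70, 100

nir_rate = detection_rate(nir_correct, nir_total)
rgb_rate = detection_rate(rgb_correct, rgb_total)

print(f"NIR detection rate: {nir_rate:.0%}")
print(f"RGB detection rate: {rgb_rate:.0%}")
print(f"Relative improvement of NIR over RGB: "
      f"{relative_improvement(nir_rate, rgb_rate):.1f}%")

The same calculation could be applied per weapon type or per camera distance to produce figures comparable to those reported above.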
Tray Denney, a sophomore in the Unmanned Systems Applications program, said participating in the research gave him hands-on experience and helped him understand how his background in software and data engineering could be applied to unmanned systems. He said doing research outside of class was extremely helpful to him.
“I got a better understanding and appreciation for how research is conducted,” Denney said. “No matter how great classes are, you don’t get to experience a lot of the nuances that you do when you are actually applying the knowledge outside of the classroom. I am better prepared for the work world now.”