Embry-Riddle, Eclipse Aerospace Develop AI Tool to Reduce Pilot Workload in Aviation Radio Communications

Embry-Riddle and Eclipse Aerospace have partnered on an AI tool that has the potential to improve pilots’ situational awareness and reduce their workload by streamlining cockpit tasks.

Embry-Riddle Aeronautical University and Eclipse Aerospace have partnered on a project to develop a tool that uses artificial intelligence technology to reduce pilot workload in aviation radio communication.

The tool employs AI capabilities in automatic speech recognition and natural language processing to capture voice transmissions and extract critical information from air traffic control (ATC) communications. It has the potential to improve pilots’ situational awareness and reduce their workload by streamlining cockpit tasks.

Eclipse Aerospace is funding the project and supporting the research with its Flight Test infrastructure.

“A safer aviation future requires lighter pilot workloads — and AI is central to that future,” said Jeffrey Rochelle, executive vice president at Eclipse Aerospace. “Eclipse Aerospace is dedicated to accelerating AI innovations that enhance safety in the cockpit.”

Dr. Andrew Schneider, an assistant professor and director of Flight Research in the College of Aviation, and Dr. Jianhua Liu, associate professor of Electrical and Computer Engineering in the College of Engineering, are co-principal investigators on the project.

“We are thrilled to collaborate with Eclipse Aerospace in advancing AI speech recognition to enhance aircraft safety and reduce pilot workload,” Schneider said.

The AI-driven system works by listening to and processing pilot-to-ATC communications. Through automatic speech recognition (ASR) and natural language processing (NLP) — two major components of AI that can interpret and respond to human speech — it captures specific instructions and commands, such as heading, flight level, speed, frequency and squawk code. The extracted elements are displayed to the pilot, who can then decide whether to send the command to the avionics.
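To illustrate the extraction step described above, here is a minimal Python sketch that pulls command elements out of an ASR transcript using simple pattern matching. This is not the team's implementation — their NLP pipeline is more robust — and the field names and patterns are assumptions chosen for illustration.

```python
import re

# Illustrative patterns for common ATC command elements.
# Real phraseology varies widely; these are simplified assumptions.
PATTERNS = {
    "heading":      re.compile(r"heading (\d{3})"),
    "flight_level": re.compile(r"flight level (\d{2,3})"),
    "speed":        re.compile(r"speed (\d{2,3})|(\d{2,3}) knots"),
    "frequency":    re.compile(r"contact \S+ (?:on )?(\d{3}\.\d{1,3})"),
    "squawk":       re.compile(r"squawk (\d{4})"),
}

def extract_commands(transcript: str) -> dict:
    """Return the command elements found in an ATC transcript."""
    found = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(transcript.lower())
        if match:
            # Keep the first non-empty capture group for this field.
            found[field] = next(g for g in match.groups() if g)
    return found

cmds = extract_commands(
    "Delta 123, turn left heading 270, climb and maintain flight level 240, squawk 4271"
)
# cmds now holds {"heading": "270", "flight_level": "240", "squawk": "4271"}
```

The extracted dictionary corresponds to the display-and-confirm step: each recognized element would be shown to the pilot, who decides whether to forward it to the avionics.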

“Our goal is to examine the feasibility of implementing this technology within a real-time, flight deck environment,” Schneider said.

Challenges include cockpit noise and degraded radio signal quality, as well as the linguistic flexibility required when pilots stray from standard phraseology.

“We need to make sure that the system can parse out these instructions in a really messy environment,” Schneider added.

The team, which has been testing the tool in both lab and flight environments, has achieved highly accurate, real-time automatic speech recognition and is on track with its natural language processing work. The project has been supported by Ph.D. student Shital Pandey, master’s student Sai Preethi Kunjeti and undergraduate Avery Cuenin, who are advised by Liu.

Another of the project’s main goals is to evaluate pilot workload reduction when using the tool. With support from Ph.D. student Elizabeth Merwin, a human factors study will measure pilot performance, workload, trust in automation and trust in AI using the initial interface.

Additionally, the researchers will study how pilots interact with AI technology and how these interactions affect pilot performance.

“This collaboration with Eclipse allows us to explore how human–AI teaming can strengthen decision-making on the flight deck,” Schneider said. “We’re excited about the potential of this technology to evolve into a trusted tool that improves situational awareness and overall pilot performance.”