Say Again: Embry-Riddle Researchers Develop AI System to Decode Aviation Speak

Embry-Riddle researchers Dr. Jianhua Liu and Andrew Schneider are collaborating on a project that uses artificial intelligence to improve the clarity and safety of radio communications between pilots and air traffic controllers. Here, they are pictured with lead flight supervisor Chris Deputy (left) in the Flight Supervisor Tower. (Photo: Embry-Riddle/Melanie Stawicki Azam)

Pilots and air traffic controllers have long faced challenges with radio communications — whether from high cockpit noise, weak transmissions or misinterpreted aviation jargon.

To help them communicate more clearly and accurately, Embry-Riddle Aeronautical University researchers have developed a system that uses artificial intelligence to transcribe and translate aviation radio communications.

While other areas of aviation have undergone automation and technological advancements, radio communication has largely remained unchanged, said Andrew Schneider, an assistant professor in the College of Aviation who directs the university’s Speech and Language AI Lab.

The system uses automatic speech recognition to convert spoken radio transmissions into text. Natural language processing interprets and refines that text by standardizing terminology, formatting spoken numbers and call signs, removing filler words and flagging potential errors.
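As a rough illustration of the refinement step described above, the sketch below normalizes a raw transcript: it drops filler words, converts spoken digits (including the aviation "niner") to numerals, and collapses digit runs into single tokens. The mappings and function name are hypothetical stand-ins, not the researchers' actual rules.

```python
import re

# Illustrative mappings only -- not the project's actual normalization rules.
SPOKEN_DIGITS = {
    "zero": "0", "one": "1", "two": "2", "three": "3", "four": "4",
    "five": "5", "six": "6", "seven": "7", "eight": "8",
    "nine": "9", "niner": "9",
}
FILLERS = {"uh", "um", "ah"}

def refine_transcript(raw: str) -> str:
    """Standardize a raw ASR transcript of a radio call (sketch)."""
    # Remove filler words, then map spoken digits to numerals.
    words = [w for w in raw.lower().split() if w not in FILLERS]
    text = " ".join(SPOKEN_DIGITS.get(w, w) for w in words)
    # Collapse runs of single digits ("4 5 2" -> "452") as a stand-in
    # for call-sign and altitude formatting.
    return re.sub(r"\b\d(?:\s\d)+\b",
                  lambda m: m.group(0).replace(" ", ""), text)
```

For example, `refine_transcript("uh delta four five two climb one zero thousand")` yields `"delta 452 climb 10 thousand"`.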

By enabling large-scale analysis of pilot-controller communications, the system could reveal patterns, phraseology errors and safety concerns that were previously difficult to study, Schneider said.

“We see an opportunity here,” he said, “for another leap forward to help controllers and pilots have safer radio communication.”

In the future, the system could also be used to provide immediate feedback to student pilots and help instructors better target communication issues.

Schneider is collaborating with Dr. Jianhua Liu, associate professor of Electrical and Computer Engineering, to build the AI-driven system.

“With this research, we can improve the efficiency of communications and reduce errors,” said Liu, who has a background in machine learning.

Schneider and Liu’s research has received two grants totaling $30,000 from Embry-Riddle’s Boeing Center for Aviation and Aerospace Safety, including a joint College of Engineering grant.

“We simply couldn’t have launched this work without that support — it enabled us to move from concept to reality,” Schneider said.

Schneider and Liu have developed a system that transcribes aviation radio communications speech with high accuracy. (Photo: Embry-Riddle/Melanie Stawicki Azam)

Their research — first supported in 2023 with a $15,000 grant — began with the collection of radio communication recordings from 12 high-traffic U.S. airports. The audio was fed into automatic speech recognition tools, such as OpenAI’s Whisper, to create transcriptions. The off-the-shelf models, however, averaged an 80 percent word error rate, highlighting the need for aviation-specific tools, said Schneider, a third-generation pilot and linguist.

“Aviation English isn’t standard conversational grammar — it’s a condensed, highly specific phraseology spoken over a noisy radio where words get clipped and specialized jargon abounds,” he said.

Liu used his expertise in signal processing, which involves the conversion of analog signals into digital data, to customize an automatic speech recognition tool that dramatically reduced the word error rate from 80 percent to less than 15 percent.

The system performed so well that it was used in a NASA-funded project, where information from flight deck communications needed to be harvested from audio with high background noise.

“In aviation, we have done a great job with using numerical data, but until now we haven’t had the tools to use qualitative data at the same scale,” said Dr. Kristy Kiernan, associate director of the Boeing Center for Aviation and Aerospace Safety and the lead investigator on the NASA project. “Large language models can open up whole new data sources that we can leverage to improve safety. That’s really exciting.”

At the university’s Second Annual Safety Research Symposium earlier this year, Liu and Schneider shared their initial findings in a presentation titled “Monitoring ATC Communication Issues With ASR and NLP.” They also delivered a presentation, “Showcasing Speech and Language AI (SaLAI) in Aviation and Aerospace Applications,” at the university’s AI Summit this past fall.

“Their project, developing automatic speech recognition for aviation applications, has many potential use cases,” said Kiernan.

Looking ahead, Schneider and Liu said they are developing real-time applications where the system would interface with aircraft systems to help detect inconsistencies between verbal instructions and aircraft behavior, flag missed calls or assist with checklist verification. Such a system could serve as a smart co-pilot, enhancing situational awareness and preventing communication breakdowns before they escalate.

Research assistant Sung Jun “Kevin” Cho, who recently graduated from Embry-Riddle with a Bachelor of Science in Aeronautical Science and has worked on the project, said the potential for AI technology to enhance aviation safety extends to “the real-time use of AI to correct our mistakes, ensuring efficiency and effectiveness of radio communications.”