Team of Engineering, Computer Science Students Enhance Future of Remote Vehicles

Adam Berlier demonstrates new wearable technology system to operate remote vehicles while maintaining situational awareness. (Photo by David Massey)

Through technical ingenuity and interdisciplinary teamwork, a group of Embry-Riddle Aeronautical University students has discovered a new application of augmented reality that can potentially save lives.

Integrating a small unmanned aircraft system, an augmented reality (AR) headset and a neuromuscular gesture recognition armband, the students have developed a system that lets remote pilots safely operate unmanned vehicles while staying mobile and carrying out additional tasks. The wearable system adapts natural human-to-human interaction, such as hand signals, into a human-to-vehicle interface.

“This technology is great for use by first responders and military,” said Adam Berlier, team lead for the project. “They do not need to set up an extensive computer control system because this small, mobile AR headpiece is a computer itself. Additionally, this AR device allows users to communicate with each other, eliminating the need for walkie talkies, and can be programmed to allow users to access only information they have clearance to see.”

Like most research, the team’s success didn’t happen overnight. For the past year, the students have been building on a project first developed by Embry-Riddle alumnus Jeremy Brown. In 2017, Brown received the People’s Choice award at the university’s Discovery Day for his initial project, which employed neuromuscular control systems to operate a small unmanned aircraft system (sUAS).

“Jeremy’s research utilized a neuromuscular gesture recognition armband that reads tiny electrical impulses off your skin to determine which muscles are moving,” explained Berlier. “This allowed the system to know which hand gestures were made, and you could fly a sUAS from those gestures. Jeremy’s biggest concern was a lack of feedback: he was unsure of the system’s accuracy, and he wanted to gather additional information about the vehicle’s environment. That’s where we came in.”
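The gesture-to-flight-command mapping Berlier describes can be pictured with a short sketch. This is purely illustrative, not the team’s actual code; the gesture names and commands are hypothetical examples, and the fallback behavior is one plausible way to address the accuracy concern he mentions.

```python
# Illustrative sketch (hypothetical gestures and commands): translating
# gestures recognized from EMG armband signals into sUAS flight commands.

GESTURE_COMMANDS = {
    "fist": "hover",
    "wave_in": "yaw_left",
    "wave_out": "yaw_right",
    "fingers_spread": "ascend",
    "double_tap": "land",
}

def command_for_gesture(gesture: str) -> str:
    """Translate a recognized hand gesture into a flight command.

    Unrecognized gestures fall back to a conservative "hover",
    one way to guard against misclassified gestures.
    """
    return GESTURE_COMMANDS.get(gesture, "hover")
```

A design like this keeps the safety-critical decision (what to do on an uncertain classification) in one place rather than scattered through the control code.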

Upon graduation, Brown passed the project on to Berlier to conduct the next phase of research. Berlier assembled a multidisciplinary team of student engineers to take the project forward.

“Our team is particularly unique because we come from diverse backgrounds, including mechanical engineering, software engineering, electrical engineering and computer science,” explained Berlier. “This allowed our team to do so much more than just one discipline could have accomplished.”

A custom-designed ground vehicle enabled detailed analysis of vehicle dynamics and control. Combining a Microsoft HoloLens AR headset, which displays computer-generated content through a small wearable headpiece, with the MYO gesture recognition armband from Brown’s research, the system allows users to retain their situational awareness while controlling the remote vehicle.

“The use of the HoloLens is an emerging technology,” explained team member Brandon Koury. “Integrating this with the MYO armband is a really new and developmental system.”

The AR display lets users see their physical surroundings while simultaneously viewing video from the remote vehicle, its GPS location, and the gesture currently detected by the armband.
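The overlay state described above might be carried by a small structure like the following. This is a minimal sketch with hypothetical field and method names, not the team’s implementation.

```python
# Illustrative sketch (hypothetical names): the kind of state an AR
# heads-up display might render alongside the user's real-world view.
from dataclasses import dataclass

@dataclass
class HudOverlay:
    video_frame_id: int   # latest frame received from the vehicle camera
    gps_lat: float        # vehicle GPS latitude, decimal degrees
    gps_lon: float        # vehicle GPS longitude, decimal degrees
    current_gesture: str  # gesture most recently detected by the armband

    def status_line(self) -> str:
        """One-line summary suitable for rendering in the AR view."""
        return (f"frame {self.video_frame_id} | "
                f"{self.gps_lat:.5f}, {self.gps_lon:.5f} | "
                f"gesture: {self.current_gesture}")
```

Keeping the overlay state in one structure makes it easy to refresh everything the user sees in a single update step.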

“In addition to first responders, this technology would be very useful for sUAS inspections of structures such as powerlines or wind turbines,” added Koury. “It is more intuitive and natural for use with unmanned vehicles and doesn’t require massive controls to operate.”

The team’s research has received rave reviews and garnered support from the Daytona Beach Campus Office of Undergraduate Research and the mechanical engineering and electrical, computer, software and systems engineering departments.

The students have presented at various conferences including the Florida Undergraduate Research Conference (FURC), National Conference for Undergraduate Research (NCUR), and Embry-Riddle’s Discovery Day. At the American Society for Engineering Education (ASEE) Southeast Region Conference, the team won third place for design. The students will also present their research internationally this summer.

Similar to last year, the team plans to pass the torch to the next generation of Embry-Riddle students, who will continue to develop and improve this meaningful technology. Incoming senior Charlie Pollock will take on the role of principal investigator, forming a team to create a centralized network that will allow a single user to control multiple vehicles.

As these student researchers prepare for graduation and life as Embry-Riddle alumni, the team knows this project has not only challenged them educationally but has also cultivated the collaborative spirit they will carry into their future careers.

“We were exposed to numerous situations and challenges that we never saw in a classroom environment,” shared Koury. “With our group’s diverse thinking, we had a wider view on how to handle these challenges than if we had all come from the same background.”