NMSU research on wearable computers may save lives of first responders

The next generation of drone controls is the focus of new research at New Mexico State University, thanks to a nearly $500,000 Cyber Human Systems award from the National Science Foundation.

Three NMSU scientists in the College of Arts and Sciences – Zachary O. Toups and Son Tran, computer science professors, and Igor Dolgov, psychology professor – are working together on the grant. The aim is to explore how wearable computers can support urban search and rescue as the field advances from multiple humans piloting a single drone to one human directing many drones.

“With this project, my hope is that we can really impact future disaster response practice and employ games for design, rather than training,” says Toups, the principal investigator for the grant. Tran and Dolgov are co-principal investigators.

The project uses simulated drones that support game players moving through the physical world as a way to design these wearable interfaces.

Their experiments over the next three years could someday lead to wearable computer equipment that would allow urban search and rescue teams in disaster zones to direct multiple drones with the wave of a hand or a few taps on a wrist device.

“Wearable systems don’t lend themselves to complex controls because they have to fit on the body and be accessible,” Toups says, adding that as drones become smarter, they require less input from humans, which opens up possibilities for the type of wearable interface the NMSU team is designing.

“You can imagine someone in treacherous terrain who needs to work with drones but needs to have their hands free in the environment to move safely,” Toups says. “Ideally, you envision the scenario where a group of drones provides intel but is smart about when they bother the human user.”

Right now, operating drones in U.S. airspace is a complex process, often requiring more than one person to operate each drone. Wearable interfaces currently on the market are often used for training and virtual reality games, not for directing live operations.

With the NSF grant, Toups, Tran and Dolgov will begin building and testing wearable cyber-human system designs that use virtual drones simulating adaptive autonomy in a video game environment.

“This grant is about studying the systems using games,” Toups says. “We want to make something that is compelling and interesting that people want to play with, and from that experience, we learn how best to use it.”