Tuesday, July 28, 2015

2015-07-28: Autonomy Incubator's Meghan Chandarana Makes Everyone A UAV Pilot

Meghan, right, demonstrates a gesture for Lauren.

In her six short weeks at the Autonomy Incubator (AI) so far, intern and Carnegie Mellon University PhD candidate Meghan Chandarana has opened a new realm of possibilities in the AI's control research through her work with gesture-based controls. With her program and an infrared hand sensor, any user can draw on a dictionary of twelve simple gestures to build a trajectory for a UAV to fly, confirm it, and then tell the vehicle to take off.  In short, generating paths for autonomous UAVs is currently a task for specialized engineers in robotics labs (like us), but in the very near future, anyone with fine motor skills could be doing this in their backyard with just a laptop and an off-the-shelf sensor.
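
To make the idea concrete, here is a rough sketch of how a gesture dictionary might map to flight waypoints. The gesture names, segment length, and waypoint spacing below are our own placeholders for illustration, not Meghan's actual twelve-gesture dictionary or code.

```python
# Placeholder sketch: mapping recognized gestures to trajectory waypoints.
# Gesture names, SEGMENT_LENGTH, and spacing are assumptions, not the real dictionary.
import numpy as np

SEGMENT_LENGTH = 1.0  # meters; the post notes each gesture currently maps to a pre-set length

def gesture_to_waypoints(gesture, start):
    """Convert one recognized gesture into a short list of 3D waypoints."""
    if gesture == "forward":
        return [start + np.array([SEGMENT_LENGTH * t, 0.0, 0.0]) for t in (0.5, 1.0)]
    if gesture == "down":
        return [start + np.array([0.0, 0.0, -SEGMENT_LENGTH * t]) for t in (0.5, 1.0)]
    if gesture == "circle":
        # trace a horizontal circle of radius SEGMENT_LENGTH that starts and ends at `start`
        return [start + SEGMENT_LENGTH * np.array([np.sin(a), 1.0 - np.cos(a), 0.0])
                for a in np.linspace(0.0, 2.0 * np.pi, 9)]
    raise ValueError(f"unrecognized gesture: {gesture}")

# A user "draws" the path as a sequence of gestures, confirms it, then the vehicle takes off.
path, pos = [], np.zeros(3)
for g in ["forward", "circle", "down"]:
    segment = gesture_to_waypoints(g, pos)
    path.extend(segment)
    pos = segment[-1]
```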

This week, her work faces its largest challenge yet as her program integrates with Lauren Howell's spline generator and Bilal and Javier's controls system for Lauren's exit demo.  The gestures from the user provide waypoints for Lauren's program, which generates a trajectory spline and feeds it to Javier and Bilal's path-following program, which in turn communicates the path to the UAV.  It's an unprecedented combination of this summer's research; in addition, this demo marks the first time anyone other than Meghan has used her gesture recognition program in the flight range. After practicing with Meghan, Lauren will be the one to set the path and launch the vehicle on Thursday. 
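
For a sense of how that handoff fits together, here is a minimal sketch of the waypoints-to-spline-to-follower pipeline. The cubic B-spline fit and the send_setpoint stub are assumptions for illustration, not the actual programs Lauren, Javier, and Bilal wrote.

```python
# Hedged sketch of the gesture waypoints -> spline -> path-follower handoff.
import numpy as np
from scipy.interpolate import splprep, splev

def fit_trajectory(waypoints, samples=200):
    """Fit a smooth 3D spline through gesture-derived waypoints (needs at least 4) and resample it."""
    pts = np.asarray(waypoints, dtype=float).T     # shape (3, N)
    tck, _ = splprep(pts, s=0.0, k=3)              # interpolating cubic B-spline (assumed spline type)
    u = np.linspace(0.0, 1.0, samples)
    x, y, z = splev(u, tck)
    return np.column_stack([x, y, z])              # dense path handed to the follower

def follow_path(path, send_setpoint):
    """Stream the sampled path to the vehicle as position setpoints; send_setpoint is a stand-in."""
    for point in path:
        send_setpoint(point)                       # e.g. a position command to the flight controller
```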

"The system is not trained on any one user. The fact that Lauren can come in and teach a trajectory and have the system understand it is awesome," Meghan said.

Meghan and Lauren beam proudly during the maiden voyage of their demo.

While her existing program is already compelling, Meghan has even more sophisticated things planned for the future. Her next step, she says, will be to allow the user more control over the trajectory they construct.

"Right now, it's a pre-set length for every path. Eventually, you would first do a gesture, and then the system would say, 'Hey, you did a spiral, how long do you want that to be?' Then you'd use another gesture to specify the length or the radius of a circle," she said.

Beyond the immediate, Meghan's ultimate goal for the project is to incorporate neural networks and machine learning into her program, then train it on a variety of users.  The more versions of the "down" gesture it sees, for example, the more nuanced its understanding of "down" will become, and the more robust its functionality will be. 
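
As a toy illustration of that multi-user training idea, here is what pooling gesture examples from several users and fitting a small neural network could look like. The features, labels, and model choice are placeholders, not Meghan's planned implementation.

```python
# Toy sketch: pool labeled hand-motion samples from many users and fit a small neural net.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_gesture_model(samples, labels):
    """samples: (N, D) array of hand-motion features from many users;
    labels: gesture names such as 'down' or 'spiral'."""
    model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000)
    model.fit(np.asarray(samples), np.asarray(labels))
    return model

# The more users contribute examples of "down", the more variation the model sees,
# and the more robust its recognition of "down" becomes.
```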

"We want multiple people to come in and teach the system, 'What is a spiral?'" she said.

Meghan's excitement about and dedication to her project are contagious; the entire AI comes out to watch her tests whenever she flies. Other interns have even started writing programs for the sensor she uses; intern Gil Montague wrote a controller that uses the angle of the user's outstretched hand to maneuver a UAV around. It's so easy to use that we're setting it up for the kids at NASA Langley's Youth Day to play with tomorrow.
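
Gil's hand-angle controller boils down to mapping palm tilt to velocity commands. Here is a hypothetical sketch of that idea; read_palm_angles and send_velocity are stand-ins for the real sensor and flight interfaces, and the speed and tilt limits are assumed.

```python
# Hypothetical hand-angle "joystick": palm roll/pitch -> clamped lateral velocity commands.
import math

MAX_SPEED = 0.5               # m/s; assumed safety cap for indoor flight
MAX_TILT = math.radians(30)   # palm tilt that commands full speed (assumed)

def tilt_to_velocity(roll, pitch):
    """Map palm roll/pitch (radians) to a clamped (vx, vy) velocity command."""
    vx = MAX_SPEED * max(-1.0, min(1.0, pitch / MAX_TILT))  # tilt forward/back -> fly forward/back
    vy = MAX_SPEED * max(-1.0, min(1.0, roll / MAX_TILT))   # tilt left/right -> fly left/right
    return vx, vy

# Control loop sketch:
# while flying:
#     roll, pitch = read_palm_angles()              # from the infrared hand-tracking sensor
#     send_velocity(*tilt_to_velocity(roll, pitch)) # stream the command to the vehicle
```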

Meghan demonstrates another kid-friendly aspect of gesture control, which is that it frees up your other hand to hold a freezie pop. Nick Woodward, enraptured, looks on.
