Meghan hails from Stockton, California, and completed her undergraduate studies in engineering at UC Berkeley. She is also a veteran NASA intern; she spent last summer at Marshall Space Flight Center (MSFC) in Alabama. Her work there, as well as her work on her thesis, reflects her specialty: controlling robots with gestures.
Yes, really: Meghan's work makes it possible for anyone to control a robot simply by holding a hand over a 3D sensor. For her project last year at Marshall, she programmed a robotic arm mounted to a CubeSat to mimic hand movements. Just look at this amazing demonstration she recorded to show her CMU lab mates:
It's like something out of a movie, right? But Meghan's just getting started: here at the Autonomy Incubator, she wants to use the same techniques to teach UAVs new maneuvers.
"Let's say a quadcopter needs to turn sideways to fly through a gap between two trees," she said. "Instead of coding for hours, you could just show it the maneuver with your hand."
She's already got her 3D sensor up and running on her desktop, along with a demo that uses hand gestures to move a cube in three-dimensional space on the screen. Her goal for this summer's project is to create a system of human-machine communication that's not only accurate, but natural to use.
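If you're curious how a demo like that works under the hood, here's a minimal, purely illustrative Python sketch of the idea. The function names (read_hand_position, hand_to_cube) are placeholders we made up, and the sensor read is faked with random numbers so the sketch runs anywhere; the point is simply that a normalized hand position reported by a 3D sensor gets scaled into the on-screen cube's coordinate space.

```python
import random  # stand-in for a real 3D sensor driver


def read_hand_position():
    """Hypothetical sensor read: returns a normalized (x, y, z) hand
    position in [0, 1] along each axis, as a depth sensor might report.
    Faked with random values here so the example is self-contained."""
    return (random.random(), random.random(), random.random())


def hand_to_cube(hand, workspace=(10.0, 10.0, 10.0)):
    """Map the normalized hand position to cube coordinates by scaling
    each axis to the size of the virtual workspace shown on screen."""
    return tuple(h * s for h, s in zip(hand, workspace))


if __name__ == "__main__":
    # Each new hand reading moves the cube to the corresponding spot.
    for _ in range(3):
        hand = read_hand_position()
        cube = hand_to_cube(hand)
        print(f"hand {hand} -> cube position {cube}")
```

A real system would, of course, track gestures over time and smooth out sensor noise, but the core mapping from hand motion to object motion is this simple.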
"[Gesture-based controls] should be as effortless and intuitive as the way I'm talking to you now," she said: no sensor-laden gloves or specialized training needed.
After such an impressive first week and with such ambitious goals for the future, we can't wait to see what kind of exciting contributions Meghan brings to the Autonomy Incubator.