
2015-08-03: Autonomy Incubator Tests Project Tango-Equipped UAV

The quadrotor takes the Tango for its first spin around the flight range. 

Corin Sandford, an intern with Autonomy Incubator (AI) member Gary Qualls who joined us in Building 1222 for the latter half of his internship, test flew a custom-built quadrotor carrying Google's Project Tango tablet in our GPS-emulation area today. The Tango, currently only available as a developer kit, is a tablet that uses a combination of IMU (Inertial Measurement Unit) data, an RGB camera, an infrared camera, and an infrared projector that generates point cloud data to map and navigate through its surroundings. Corin's work explores the potential for using the tablet as a mapping and navigation system for autonomous unmanned vehicles, a problem many of the AI's engineers are also working to solve—think of PI Loc Tran's work with MSCKF navigation, Alex's SVO research, the Kalman filter research Josh and Mike have done for Loc, and PI Jim Neilan's investigations into sensor fusion.

The tablet clips into a case on the bottom of the UAV.

Corin's project for the summer seems straightforward at first: find out if and how the Tango tablet could be useful to NASA's research into autonomous, GPS-deprived navigation. However, the Tango presents some interesting possibilities and challenges, especially because of the unique way it localizes itself. Instead of finding features in real time and localizing itself as it goes along, the Tango has a "learning mode" that requires the user to carry or fly it around the environment beforehand so that it can make a map of the surroundings. Once it's created the map, the tablet uses it as a ground truth for navigation during the mission.
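To make that two-phase idea concrete, here's a minimal sketch of what the learning pass might look like in code. This is our own illustration in Python, not the Tango SDK; the Pose class, the pose_stream argument, and the area_map.json file name are all hypothetical.

import json
from dataclasses import dataclass, asdict

@dataclass
class Pose:
    """One pose sample: position in meters plus a rotation quaternion."""
    x: float
    y: float
    z: float
    qx: float
    qy: float
    qz: float
    qw: float

def record_learning_pass(pose_stream, map_path="area_map.json"):
    """Carry or fly the device around the environment once, saving every
    pose the tracker reports.

    pose_stream is assumed to yield one Pose per video frame; the saved
    file stands in for the Tango's learned map of the surroundings.
    """
    learned = [asdict(p) for p in pose_stream]
    with open(map_path, "w") as f:
        json.dump(learned, f)
    return learned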

"It localizes against the map you created in learning mode," Corin explained. "It's trying to match the visual odometry to the map... if you find a point where you know where you are, you can stop averaging [sensor data] and start over." Essentially, while it's in learning mode, it records its position in "poses" with x,y, and z coordinates and a rotation quaternion per frame of video, and whenever a pose from its memory matches a pose in real time, it re-localizes itself and starts the algorithm over. By constantly updating the program's point of reference, this method cuts down on drift and allows for more precise positioning.

Corin "flies" the UAV around by hand before the flight test.

Corin found that while the tablet's methods give it an excellent idea of where it is in space, those same methods create challenges for using it aboard an autonomous vehicle in unknown surroundings. Its dependence on comparisons between its internal map and the real-time environment means that it cannot detect and avoid moving obstacles; in addition, its infrared projector does not function in daylight, limiting its use in outdoor missions like search-and-rescue.

While he emphasized that the Tango was not built for capabilities like obstacle avoidance—"Just because it sees a wall, it doesn't know not to run into it"—he remained enthusiastic about what the tablet can do.

"Here in the Autonomy Incubator, it could be used for path projection," he said.

Although his project ends this week, Corin has an exciting new life waiting for him outside NASA Langley's gates: he'll be attending CU Boulder in the fall to begin a PhD in Computer Science, which will also allow him to work in the university's Autonomous Robotics and Perception Group (ARPG) lab.
