2016-07-26: Autonomy Incubator Makes Data-Denied Breakthrough

Yesterday, the Autonomy Incubator (Ai) team assembled in the flight range to watch history being made: the Ai's first completely data-denied autonomous flight. Just a quadrotor, a camera, and a visual odometry algorithm keeping the whole thing on path. No positioning system in sight (literally and figuratively!).

"What was interesting is that yesterday, we had no GPS or [indoor GPS emulator] Vicon™ to help us. It was just the visual odometry, and it handled very well," PI Jim Neilan said. Jim has been working on a data-denied navigation solution with the Ai for years, and yesterday's success was a massive validation for him and his team.

Here's the quick-and-dirty on how the Ai does visual odometry. We use a downward-facing global shutter camera to collect information on where the UAV is. The "global shutter" bit is really key here: most cameras have rolling shutters, which means the frame gets exposed and read out one row of pixels at a time. Rolling shutters are fine for most things, but when the camera is moving, they cause a lot of skew and wobble (the "jello effect") that makes visual odometry next to impossible. A global shutter camera exposes the entire frame at once, making for a cleaner, more reliable source of visual data.
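
To give a flavor of what that camera data gets used for, here's a minimal sketch of frame-to-frame visual odometry with OpenCV. This shows the general technique rather than the Ai's actual pipeline, and the filenames and camera intrinsics below are placeholders:

import cv2
import numpy as np

# Hypothetical intrinsics and frame files; in practice these come from
# calibration and the camera driver.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
frame1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
assert frame1 is not None and frame2 is not None, "placeholder frames not found"

# Detect and match features between consecutive frames.
orb = cv2.ORB_create(1000)
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float64([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float64([kp2[m.trainIdx].pt for m in matches])

# Estimate the essential matrix with RANSAC, then recover relative motion.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

# A single camera recovers translation direction only (t is unit length);
# absolute scale has to come from somewhere else, e.g. the IMU or barometer.
print("rotation between frames:\n", R)
print("translation direction:\n", t.ravel())

Chain those relative motions frame after frame and you get a position estimate with no external positioning system at all.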

"I'd say it's around forty to fifty frames per second, on average," Jim said. "We're using a very powerful global shutter camera."

The data from the camera (as well as from other sensors, like IMUs and barometers) gets fed into an algorithm called PTAM: Parallel Tracking And Mapping.

"It's actually based on an augmented reality thing designed for cell phones in 2008 by some guys at Oxford," Jim said. The basic idea behind PTAM is building a map of point features in the environment (the mapping) and estimating the camera's position and orientation relative to that map as it moves (the tracking). These two jobs run simultaneously on separate threads, so they're parallel. See what I'm saying? Here's the original paper if you're intrigued.

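PTAM itself is a sophisticated keyframe-based system, but the "parallel" part is easy to see in miniature. Here's a toy sketch (made-up names and timings, not PTAM's or the Ai's code) of the two-thread split: tracking runs at camera rate against whatever map currently exists, while mapping grinds away in the background:

import queue
import threading
import time

# Shared state: the tracker reads the map at frame rate; the mapper extends it
# in the background from keyframes. (Toy stand-ins for PTAM's point map.)
world_map = []
map_lock = threading.Lock()
keyframes = queue.Queue()

def track(frames):
    """Runs at camera rate: estimate each frame's pose against the current map."""
    for frame in frames:
        with map_lock:
            n = len(world_map)
        print(f"track: pose of frame {frame} from {n} landmark sets")
        if frame % 5 == 0:            # occasionally promote a frame to a keyframe
            keyframes.put(frame)
        time.sleep(0.02)              # ~50 fps, in line with the camera rate above
    keyframes.put(None)               # tell the mapper to shut down

def build_map():
    """Runs slower, in parallel: expensive map refinement never blocks tracking."""
    while (kf := keyframes.get()) is not None:
        time.sleep(0.1)               # stand-in for expensive map optimization
        with map_lock:
            world_map.append(f"landmarks from keyframe {kf}")
        print(f"map:   added keyframe {kf}")

tracker = threading.Thread(target=track, args=(range(20),))
mapper = threading.Thread(target=build_map)
tracker.start(); mapper.start()
tracker.join(); mapper.join()
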
An aside: augmented reality is also the thing that lets Pokémon Go put little anime animals into your surroundings. So, the next time you throw a Poké Ball at a Charmander on your kitchen table, remember you're using the same technology that's revolutionizing autonomous flight!

We've been using PTAM for a while in both our indoor and outdoor tests, but yesterday's test was exciting because there was no external data source to correct the drift that gradually accumulates in the algorithm's position estimate, and it still performed beautifully. Watch the video; doesn't that flight path look buttery smooth? Personally, I didn't realize they'd switched the algorithm on until I looked over at Jim and saw his hands weren't moving the controls.

With a successful first flight in the logbooks, Jim says they have three concrete goals moving forward.

"We have to do a couple things. We need to clean the code up and make it robust to failure. We have to put hooks in the code so we can inject corrections periodically. And we need to fly it outside."

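To make that second goal concrete: a correction hook just needs a way to blend an occasional absolute fix (from GPS outdoors, or Vicon indoors) into the drifting odometry estimate. Here's a hedged, minimal sketch of the idea; the class and numbers are hypothetical, not the Ai's code:

import numpy as np

class DriftingEstimator:
    """Toy 2-D dead-reckoning estimator: integrating noisy velocity makes
    position error grow without bound until a correction arrives."""
    def __init__(self):
        self.pos = np.zeros(2)

    def predict(self, velocity, dt):
        self.pos += velocity * dt                 # drift accumulates every step

    def inject_correction(self, fix, weight=0.5):
        # The "hook": pull the estimate toward an absolute fix whenever one
        # happens to be available (GPS, Vicon, a recognized landmark...).
        self.pos = (1 - weight) * self.pos + weight * fix

rng = np.random.default_rng(0)
est, truth = DriftingEstimator(), np.zeros(2)
for step in range(1, 101):
    v_true = np.array([1.0, 0.5])                        # made-up true velocity
    truth = truth + v_true * 0.02
    est.predict(v_true + rng.normal(0, 0.05, 2), 0.02)   # noisy odometry
    if step % 25 == 0:                                   # a fix arrives occasionally
        est.inject_correction(truth + rng.normal(0, 0.01, 2))
print("position error after corrections:", np.linalg.norm(est.pos - truth))
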
Stay tuned!

