Wednesday, June 15, 2016

2016-06-15: Autonomy Incubator Conducts Outdoor Visual Odometry Test



The Autonomy Incubator (Ai) took a major step toward its package delivery objective last week with its first outdoor test of simulated package delivery. PIs Loc Tran, Jim Nielan, and Kyle McQuarry directed the test.

Pilot Zak Johns and PIs Jim Nielan, Loc Tran, and Kyle McQuarry celebrate.

We used one of our new quad-rotor UAVs, OG1 (short for OranGe 1), for the test, which was also its maiden outdoor voyage. For safety reasons (we are testing research-level software on autonomous vehicles, after all), the flight was tethered.

OG1 takes to the sky (on a leash).

Wednesday's test consisted of two short missions. The first mission, designed to test the software pipeline for the sensor payload (a package equipped with an ozone sensor), was a drop-off and pick-up. OG1 autonomously took off, navigated to the drop point using its onboard GPS, simulated a landing and dropped off the sensor payload, took off again, took a picture of the payload, and then returned to base.

"We took the picture to recognize the area where we dropped the package," Jim Nielan explained.

Recognizing the target area became vital during the second phase, when the GPS data was deliberately made "sporadic" and OG1 had to rely on visual odometry to navigate back to the sensor payload and retrieve it.
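A very simplified sketch of that kind of fallback logic might look like the following. The timeout value, data structures, and function are assumptions made for illustration; the Ai's actual navigation stack is not shown here:

```python
import time

# Illustrative fallback: trust GPS while fixes are fresh, otherwise
# dead-reckon by adding the visual-odometry displacement accumulated
# since the last good fix. GPS_TIMEOUT_S is an assumed threshold.

GPS_TIMEOUT_S = 1.0

def estimate_position(last_gps_fix, last_gps_time, vo_displacement, now=None):
    """Return (position, source), with position as an (x, y, z) tuple in a
    local frame and vo_displacement as the VO-estimated motion since the
    last good GPS fix."""
    now = time.time() if now is None else now
    if (now - last_gps_time) < GPS_TIMEOUT_S:
        return last_gps_fix, "gps"
    # GPS is sporadic or stale: fall back to visual-odometry dead reckoning.
    x, y, z = last_gps_fix
    dx, dy, dz = vo_displacement
    return (x + dx, y + dy, z + dz), "visual_odometry"

# Example: last GPS fix is 3 s old, VO says we've moved 2 m since then.
pos, source = estimate_position((10.0, 5.0, 8.0), time.time() - 3.0, (0.0, 2.0, 0.0))
print(source, pos)   # -> visual_odometry (10.0, 7.0, 8.0)
```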

The green pyramid on the bottom of OG1 is the sensor payload.

As far as test flights go, this one was unusual for our lab because of its heavy use of GPS. The Ai designs its UAVs to operate in GPS-deprived environments; it's one of the reasons we fly indoors so often. However, this test used global positioning data in the same way other tests use Vicon™ data: to establish a ground truth. The stored data from the drop-off flight, combined with the sporadic data during the pick-up flight, allowed the PIs to quantify how much drift was happening in the visual odometry algorithm.
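One simple way to quantify drift of that kind (a sketch only, with made-up numbers; the PIs' actual analysis may differ) is to log both trajectories and compare them point by point against the GPS ground truth:

```python
import numpy as np

# Sketch of drift quantification: compare a visual-odometry trajectory
# against GPS ground truth logged at the same timestamps.
# The positions below are illustrative values in a local frame (meters).

gps_truth   = np.array([[0.0, 0.0], [5.0, 0.1], [10.0, 0.0], [15.0, -0.1]])
vo_estimate = np.array([[0.0, 0.0], [5.2, 0.3], [10.6, 0.7], [16.1,  1.2]])

errors = np.linalg.norm(vo_estimate - gps_truth, axis=1)                 # per-sample error
distance = np.sum(np.linalg.norm(np.diff(gps_truth, axis=0), axis=1))    # path length flown

print(f"final drift: {errors[-1]:.2f} m over {distance:.1f} m "
      f"({100 * errors[-1] / distance:.1f}% of distance traveled)")
print(f"mean error:  {errors.mean():.2f} m")
```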

The second mission focused entirely on the visual odometry pipeline: a simple fifteen-meter flight from one GPS point to another using only visual odometry.

Loc and Kyle confer over the data.

Of the day's results, Jim said, "The first part went very well, and the second part also proves that the visual odometry pipeline works and fails safely." Going forward, the Ai's goal is to create a more robust visual odometry algorithm for longer distances.

"Next test, we'll test 100 yards or more with visual odometry," Jim said.

Jim Nielan holds OG1 aloft in triumph.
The results of Wednesday's tests hold great promise not just for the Ai, but also for the entire scientific community: "There's only a very small number of research centers actually flying physical devices to make autonomy in national airspace safer and more reliable," Jim emphasized.

The live feed from OG1's onboard computer vision setup.

