Tuesday, July 5, 2016

Autonomy Incubator Begins Tests of New Precision Landing Abilities

Ben Kelley, Loc Tran, Matt Burkhardt, and Kyle McQuarry prepare to initialize.

A new frontier of the Autonomy Incubator's (Ai) research began today as PI Loc Tran and NASA Space and Technology Research Fellow Matt Burkhardt started running their control software on a vehicle in real time. While Matt has worked on varied projects so far this summer, this application of his controls research has become his primary focus.

"I'm supporting the most pressing of challenges, which is precision landing," he said.

What is precision landing, exactly? It's the ability of a UAV to find its landing point, center its body over that point, and land there, precisely where it's supposed to land. An example of a commercially available solution is a product called IR-LOCK™, which uses an infrared beacon on the ground and an infrared camera on the vehicle to guide the vehicle to a precise landing. It works well, but this is the Ai: we want "unstructured" solutions. Between Loc's visual odometry work and Matt's control algorithm, our vehicles will soon be able to execute precise landings using input from nothing more than their regular onboard cameras.
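To make the idea concrete, here's a minimal sketch of what that closed loop could look like, in Python. Every name in it (the tracker, the vehicle interface, the commands) is a hypothetical stand-in for illustration, not the Ai's actual software:

```python
# Hypothetical precision-landing loop. The tracker and vehicle
# interfaces here are illustrative stand-ins, not the Ai's real code.

def precision_land(tracker, vehicle, tolerance_px=5):
    """Center the vehicle over its landing point, then descend."""
    while True:
        frame = vehicle.get_camera_frame()
        # The tracker compares the live frame to a stored reference
        # image and reports how far off-center the landing point is.
        dx, dy, dyaw = tracker.offset_from_reference(frame)
        if abs(dx) < tolerance_px and abs(dy) < tolerance_px:
            break  # centered over the landing point
        vehicle.send_velocity_command(dx, dy, dyaw)
    vehicle.descend_and_land()
```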

"What we're attempting to do here is replicate the behavior, but eliminate the beacon," Matt said. Eliminating the beacon (or any fiducial, for that matter) is what we mean by "unstructured". We can't rely on being able to mark the environment in any way to assist in our autonomous operations.

Matt and Jim Neilan perform hardware checks before testing.

Our new autonomous precision landing abilities will have immediate applications in the Ai's sensor package delivery mission.

"In the second phase of the mission, we're going to try to go back to launch without GPS," Loc explained. "The idea is, when we take off, we'll take a picture of the package, so that when we come back, we can look for that exact spot [and land]. We call it the natural feature tracker."

I have some screengrabs of Loc's algorithm at work; take a look. He ran an example for us: the top image is the unaltered reference picture, and the bottom one has been shifted around to mimic what a UAV might see as it comes in to land. The algorithm compared the bottom image against the reference picture and suggested directional corrections to make the two line up: in this case, move left and up a bit, plus yaw a little clockwise.
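For a feel of how a suggestion like that can be computed, here's a rough sketch using off-the-shelf OpenCV feature matching. It's purely illustrative, and Loc's natural feature tracker may work quite differently under the hood:

```python
# Illustrative only: match a reference image against a live frame and
# estimate the shift and yaw between them with standard OpenCV tools.
import cv2
import numpy as np

def suggest_correction(reference, live):
    """Estimate pixel shift (dx, dy) and yaw (degrees) from reference to live."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(reference, None)
    kp2, des2 = orb.detectAndCompute(live, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]

    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Fit a similarity transform (rotation, uniform scale, translation);
    # its translation gives the shift and its rotation gives the yaw.
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    dx, dy = M[0, 2], M[1, 2]
    yaw = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
    return dx, dy, yaw
```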

The reference image, taken from the onboard camera...

...and what the UAV might see as it makes its approach. 

Loc's natural feature tracker (NFT) is only one half of the equation, however. Matt's control algorithm takes the output from the feature tracker and uses it to autonomously guide the vehicle into position.

"The challenge is, given an image like this, what do we tell the vehicle to do?" Matt said. "My controller has to take these images, track the feature, and apply force and torque to the vehicle."

For instance, in the example above, Matt's controller would take the feature tracker's recommendation to go left and up a bit, and then manipulate the vehicle to actually move it left and up a bit. Make sense? Loc's software provides the impulse; Matt's software provides the action. Together, they make a precision-landing powerhouse.
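As a sketch of the flavor of that hand-off, here's a bare-bones proportional-derivative (PD) loop that turns the tracker's pixel offset into velocity commands. The gains, sign conventions, and camera-to-body alignment are all invented for illustration; Matt's actual controller works at the level of forces and torques on the vehicle:

```python
import time

class PDController:
    """Minimal PD loop mapping image-space error to velocity commands.

    Purely illustrative: gains and interfaces here are assumptions,
    not the Ai's actual controller.
    """

    def __init__(self, kp=0.002, kd=0.0005):
        self.kp, self.kd = kp, kd
        self.prev_error = (0.0, 0.0)
        self.prev_time = time.monotonic()

    def update(self, dx_px, dy_px):
        now = time.monotonic()
        dt = max(now - self.prev_time, 1e-3)
        # Rate of change of the error, for the damping term.
        ddx = (dx_px - self.prev_error[0]) / dt
        ddy = (dy_px - self.prev_error[1]) / dt
        self.prev_error, self.prev_time = (dx_px, dy_px), now
        # P term drives the vehicle toward the landing point; D term
        # damps the approach so it doesn't overshoot.
        vx = self.kp * dx_px + self.kd * ddx
        vy = self.kp * dy_px + self.kd * ddy
        return vx, vy
```

In practice a loop like this would run at the camera's frame rate, nudging the vehicle a little on every new image until the tracker reports it is centered.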

In today's tests, Loc and Matt wanted to hand-fly a quadrotor while their software ran in the background— not flying autonomously yet, but still generating output for the researchers to review for accuracy. However, they had some obstacles to overcome first.

"We need to take the [reference picture] in order to see if the algorithm works, but with someone holding it, their feet were always in the picture. And we can't have that," Matt said. Which led to this in the flight range:

PI Ralph Williams makes a hardware adjustment.

The ingenuity of the engineers here at NASA Langley Research Center cannot be stifled. Suspended as though hovering about five feet above the ground, the UAV took an accurate, feet-free picture to use in feature matching and tracking. Ralph did the honors of hand-flying the UAV around the range while Matt observed his algorithm's output.


Now, with a few rounds of successful hand-flying tests in the logbooks, Matt and Loc plan to move to remote-controlled indoor flights as their next step. We'll keep you posted on those and any other exciting developments in this mission right here on the Ai blog.

