Wednesday, August 31, 2016

2016-08-31: Meghan Chandarana and Erica Meszaros Summer 2016 Exit Presentation

Meghan Chandarana is a PhD student in mechanical engineering at Carnegie Mellon University in Pittsburgh who first joined the Ai as an intern last summer. Erica Meszaros, who has been with the Ai as an intern since January, recently received a master's degree from Eastern Michigan University and is currently continuing her studies at the University of Chicago. Together, they presented the results of their research developing a natural language interface for non-expert autonomous UAV management as part of this summer's HOLII GRAILLE demo.

Meghan created a gesture node for the system, while Erica created a speech node, both of which could create flight commands independently or in cooperation with each other. In their user study, the results of the speech and gesture nodes were compared against user performance on a mouse-based interface, something they knew subjects would be familiar with and comfortable using.
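The two-node idea, with each input channel producing flight commands on its own or in combination, can be sketched roughly like this. The node functions, command fields, and keyword matching below are illustrative assumptions, not the actual Ai interface:

```python
# Toy sketch: two input nodes that each emit (partial) flight commands.
# Node names, the command dictionary format, and the gesture/speech
# vocabularies here are invented for illustration.
def gesture_node(gesture):
    """Map a recognized gesture shape to a maneuver."""
    shapes = {"circle": "orbit", "line": "fly_straight"}
    return {"maneuver": shapes.get(gesture)}

def speech_node(utterance):
    """Pull a parameter out of a spoken command via toy keyword matching."""
    command = {}
    if "meters" in utterance:
        command["distance_m"] = float(utterance.split()[0])
    return command

# Each node can issue a command alone, or their outputs can be merged
# so speech fills in parameters the gesture alone doesn't specify.
combined = {**gesture_node("line"), **speech_node("20 meters ahead")}
```

Merging the two dictionaries is one simple way to model "independently or in cooperation": either node's output is a valid command on its own, and together they specify more.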

The results of their user study were promising, as you'll see, and both Meghan and Erica foresee advancements and applications for this summer's research.

Wednesday, August 10, 2016

2016-08-10: Gale Curry Summer 2016 Exit Presentation

Gale Curry from Northwestern University focused on flocking and formation flight during her summer with the Autonomy Incubator. She was challenged to enable a fleet of micro UAVs to fly through an aperture, like a window, and then reassemble on the other side. Equipping teams of small vehicles with this kind of adaptability and agility in the air could support search and rescue, earth science data collection, and disaster assessment - just to name a few missions!

Gale may be rejoining the Ai team in January and we are excited to see what amazing things she does next!

2016-08-10: Kastan Day Summer 2016 Exit Presentation

Kastan Day, the videography half of the Ai's social media powerhouse (ahem ahem), presented on his accomplishments and milestones from the summer. His highlights included creating a lower-third animation for introducing interview subjects, establishing our YouTube channel, and, in his words, "telling the story of the Ai." Kastan and his videos enriched the Ai's web content and freed up my writing in ways I never expected— he was a driving force in making this summer the most popular period in the history of the Ai's social media presence. He intends to study artificial intelligence as a freshman at Swarthmore College this fall.

2016-08-10: Angelica Garcia Summer 2016 Exit Presentation

Angelica, a master's student at the University of Central Florida, came to the Ai with the goal of incorporating virtual reality into our user interface pipeline. As one of the stars of the HOLII GRAILLE team, she developed the virtual reality component of the demo's interface pipeline.

2016-08-10: Jeremy Lim Summer 2016 Exit Presentation

Jeremy Lim presented his work as the systems integration specialist for the HOLII GRAILLE team to wrap up his second internship at the Ai. He provided an in-depth explanation of the HOLII GRAILLE project before describing how he integrated the widely varied elements of the demo— gesture recognition, trajectory generation, virtual reality, and autonomous flight— into one continuous pipeline using our in-house network, AEON (Autonomous Entities Operations Network), and DDS (Data Distribution Service).
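The glue in a pipeline like this is topic-based publish/subscribe messaging, which is the pattern DDS provides. AEON's actual API is in-house and not public, so the bus below is a minimal in-process stand-in to show the idea:

```python
# Minimal illustration of topic-based publish/subscribe, the messaging
# pattern DDS provides. This toy bus is a stand-in; AEON's real API and
# topic names are assumptions, not the Ai's actual code.
from collections import defaultdict

class Bus:
    """Toy in-process message bus: each topic maps to subscriber callbacks."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subs[topic]:
            callback(message)

bus = Bus()
received = []
# A trajectory node listens for flight commands from any input node.
bus.subscribe("flight_commands", received.append)
# Gesture and speech nodes publish to the same topic independently,
# without knowing anything about who is listening.
bus.publish("flight_commands", {"source": "gesture", "action": "takeoff"})
bus.publish("flight_commands", {"source": "speech", "action": "land"})
```

The decoupling is the point: each HOLII GRAILLE element only needs to agree on topic names and message formats, not on each other's internals.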

2016-08-10: Lauren Howell Summer 2016 Exit Presentation

Lauren Howell, an Aero Scholar joining us from the University of Alabama, ended her second summer at the Ai with a presentation on her risk assessment and trajectory generation research. An expansion of her work with Bezier curves from last summer, Lauren's research this summer had her working in tandem with Javier Puig-Navarro as the HOLII GRAILLE specialists on trajectory generation and collision avoidance. Highlights of her accomplishments included using Pythagorean hodographs to augment her trajectory generation algorithm and traveling to the Netherlands to attend the Airbus Airnovation conference.

2016-08-10: Josh Eddy Summer 2016 Exit Presentation

Josh, a master's student at his alma mater Virginia Tech, presented on his research in visual-inertial navigation and state estimation in GPS-denied environments using unscented Kalman filters. If you want a brilliant yet accessible explanation of state estimation, Kalman filtering, or GPS-denied navigation, watch this video. As I write this, Meghan, Nick, and Gale are talking about what an effective teacher Josh is.

"I don't feel like I just watched a half-hour lecture on Kalman filtering," Nick said.

"Yeah, he'd be a great professor," Meghan added.

Josh did super cool, innovative work and he's great at explaining it. We're going to miss him and his stage presence a lot around here.
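In the spirit of Josh's teaching, here is the predict/update idea behind any Kalman filter in its most stripped-down form. Josh's research uses the unscented variant on a full visual-inertial state; this scalar linear sketch, with made-up noise values, only illustrates the core cycle:

```python
# Much-simplified illustration of Kalman filtering: a scalar random-walk
# state fused with noisy measurements. Josh's actual work uses the
# unscented Kalman filter on a far richer state; the values here are
# invented for illustration.
def kalman_step(x, P, z, Q=0.01, R=1.0):
    """One predict/update cycle.
    x: state estimate, P: estimate variance, z: new measurement,
    Q: process noise variance, R: measurement noise variance."""
    # Predict: under a random-walk model the estimate carries over
    # while its uncertainty grows by the process noise.
    P = P + Q
    # Update: blend prediction and measurement by the Kalman gain,
    # which weighs how much we trust each.
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1 - K) * P
    return x, P

x, P = 0.0, 1.0  # start unsure, with a wrong initial guess
for z in [1.2, 0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, z)
# x drifts toward the measurements (~1.0) and P shrinks as evidence accumulates
```

The unscented version Josh uses replaces the linear update with carefully chosen sample points so the same cycle works for nonlinear motion and sensor models.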

2016-08-10: Deegan Atha Summer 2016 Exit Presentation

Deegan Atha, a rising senior at Purdue University, wrapped up his second semester in the Ai with a presentation on his computer vision research. Deegan's most prominent accomplishment is 3DEEGAN, the object detection and classification system he created to run onboard the Ai's UAVs and allow them to instantaneously survey and recognize their surroundings. He used a convolutional neural network to detect objects in the field of view, and even trained it on a homemade image library of pictures he took of the Ai's vehicles and people.
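The operation at the heart of a convolutional neural network like the one behind 3DEEGAN is sliding a small filter over an image and recording the response at each location. The toy image and hand-picked edge filter below are illustrative; a real network learns its filters from training data, as Deegan's did from his homemade image library:

```python
import numpy as np

# The core CNN operation: slide a small filter over an image and record
# its response at every position. The image and filter here are toy
# examples; a trained network learns many such filters automatically.
def convolve2d(image, kernel):
    """Valid-mode 2-D cross-correlation (what CNN layers compute)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A tiny image with a vertical edge down the middle.
image = np.array([[0, 0, 1, 1]] * 4, dtype=float)
edge_filter = np.array([[1.0, -1.0]])  # responds where brightness changes
response = convolve2d(image, edge_filter)
```

Stacking many learned filters, with nonlinearities between layers, turns this simple operation into a detector that can pick the Ai's vehicles and people out of a camera frame.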

2016-08-10: Jacob Beck Summer 2016 Exit Presentation

Jacob Beck leaves the Autonomy Incubator to begin pursuing his PhD in robotics at Oregon State University. During his time here, he became known for his dazzling demonstrations of his soft gripper project and his Spider Bot autonomous mobile gripper. In addition, he and intern Nick Woodward collaborated on the design for the package delivery system currently in use on our Orevkon drone for our outdoor science mission.

2016-08-10: Javier Puig-Navarro Summer 2016 Exit Presentation

Javier ended his third summer with the Ai in triumph, not only in his individual research but also as a vital part of the HOLII GRAILLE demo. Javier devotes most of his effort to vehicle trajectories: generating them, coordinating them between multiple vehicles, checking them for collisions, replanning them to avoid collisions, and then mathematically proving them correct. We're grateful to benefit from his expertise, and we can't wait to have him back in the Ai next time!

Wednesday, August 3, 2016

2016-08-03: NASA LaRC Hosts AIAA Intelligent Systems and Technology Conference, Day 1

Director Crisp, Dr. Simmons, and Dr. Casbeer listen as Dr. Borener answers a question.

NASA LaRC is thrilled to host the three-day AIAA ISTC Workshop this week, where we've welcomed guests from all over the world to come discuss intelligent systems and autonomy in aeronautics. The workshop got off to an auspicious start with a keynote on Unmanned Disaster Response from Dr. Robin Murphy, followed by a panel on "Lessons Learned from Government Agencies" featuring LaRC Director of Systems Analysis and Concepts Vicki Crisp and moderated by our own fearless leader, Ai head Dr. Danette Allen.

Danette announces that the panel is open for questions.

The four-member panel consisted of Director Crisp from NASA, Dr. David Casbeer from the Air Force Research Laboratory (AFRL), Dr. Sherry Borener from the Federal Aviation Administration (FAA), and Dr. Reid Simmons from the National Science Foundation (NSF). Each of them presented on aspects of autonomy and flight from their agencies' perspectives before Danette opened the floor to questions from the audience.

Hearing the differences in approaches to autonomy between agencies was illuminating, and made for an excellent discussion. For example, Dr. Borener's comments focused on the importance of developing communication abilities for unmanned vehicles in the national airspace.

"The problem is that it has to blend into the airspace with other, manned vehicles," she said.

Dr. Borener makes a point about flight safety and autonomous aircraft in the NAS.

By contrast, Dr. Casbeer emphasized the continuing relevance of controls research even as autonomy becomes more advanced. While some in aviation view autonomy as a replacement for controls, he argued that controls must and will remain important for autonomy to succeed, especially in the inner loop, at the lowest levels of the software.

"[People] say control is dead... it's like a spoon in the heart," he joked.

Dr. Casbeer addresses the audience.

The discussion also expanded beyond the purely scientific and into the larger constructs surrounding autonomy research. An audience member asked Director Crisp about the federal government's hesitance to fund what it calls "duplication studies," even though replication is a key part of the scientific process; she answered from her experience as a "self-proclaimed bureaucrat."

"That's a great question, by the way. We're having these conversations not only within our own agency, but with Congress, OMB, and others," she said. "It's them asking, 'Explain to me why this is important'... We're trying to help inform and educate."

Director Crisp offers insight into communicating NASA's goals across the federal government.

Finally, at the end of the panel, Danette asked a "lightning round question" that she had been asked herself at the DoD HFE TAG held recently at NASA LaRC: "If you were king or queen, and had all the money in the world, what problem in autonomy would you solve?"

Dr. Simmons had an immediate answer: "My tall pole would be having intelligent systems understand their limitations."

2016-08-01: Autonomy Incubator Interns Join Forces, Create Autonomous Spider Bot

Deegan Atha and Jacob Beck rehearse for their new spot in our demo lineup.

You've seen 3DEEGAN. You've seen Spider Bot. Now, two of the Autonomy Incubator's (Ai) hottest technologies join forces to take package delivery where it's never gone before: an autonomous mobile gripper equipped with computer vision and object tracking. It's the smart, lovable robot spider of the future.

Look at this friendly little guy find and hug that ball!

First of all, if you haven't read about Deegan's and Jacob's work, I'd suggest following the links at the top of the article to catch up— according to Jacob, "the best way to understand what we're doing is to talk about the individual projects first." Although his work and Deegan's are in disparate fields, Jacob emphasizes that the computer vision software and the Spider Bot were actually very easy to integrate. 

"Getting our software to talk to each other wasn't all that difficult," he said. "Deegan's sends mine the coordinates and then the robot acts on them."

Jacob tunes up the Spider Bot between tests.

"The neural net is still running on the same computer, with the lidar and everything. We're just switching the video feed to the Spider Bot," Deegan added. His software runs on a computer onboard the UAV, and uses a forward-facing camera during regular operation. For Spider Bot operations, the video feed will switch to the on-gripper camera.

"We're just changing the view of the neural network," he said.

Deegan gets the lidar and computer vision algorithm ready to go.

The main objective of integrating Deegan's software with Jacob's gripper is to enhance the Spider Bot's level of autonomy, thereby taking many of the burdens of package delivery off the UAV itself. Right now, picking up a package involves challenges with precision landing, challenges often met with workarounds like infrared sensors. But those workarounds add structure to the environment, which is something we don't want to do in the Ai. An autonomous mobile gripper changes a lot about how we approach package pick-up.

"One of the things with the vision system is that you don't need any visual markers. And with this, we don't need precision landing," Deegan explained. Once deployed, the Spider Bot gripper will find the package, lock onto it and move towards it, then lift its front legs and grab the package before winching itself back up to the UAV. 
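The pick-up sequence Deegan describes reads naturally as a small state machine. The state names and transition conditions below are my own sketch of that sequence; the real Spider Bot logic and its vision interface are not shown here:

```python
# Sketch of the Spider Bot pick-up sequence as a simple state machine.
# State names and transition conditions are illustrative assumptions.
def next_state(state, package_seen, at_package):
    """Advance the gripper one step through its pick-up sequence."""
    if state == "SEARCH" and package_seen:
        return "APPROACH"   # the vision system has found the package
    if state == "APPROACH" and at_package:
        return "GRAB"       # close enough to lift the front legs
    if state == "GRAB":
        return "WINCH"      # package grabbed; reel back up to the UAV
    if state == "WINCH":
        return "DONE"
    return state            # otherwise, keep doing what we're doing

state = "SEARCH"
state = next_state(state, package_seen=False, at_package=False)  # still searching
state = next_state(state, package_seen=True, at_package=False)   # found it
state = next_state(state, package_seen=True, at_package=True)    # reached it
state = next_state(state, package_seen=True, at_package=True)    # grabbed
state = next_state(state, package_seen=True, at_package=True)    # winched up
```

Driving the transitions from the vision system's coordinates is what lets the gripper, rather than the UAV, absorb the precision-landing problem.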

A Spider Bot's eye view of the author's feet. Luckily, it didn't grab me.

"It looks like a hunting spider; did you draw inspiration from real spiders for this?" I asked Jacob after he and Deegan wrapped up today.

"I guess it does! Kind of, yeah," he said. "I didn't look at pictures or anything, but it does work like that."