Thursday, September 10, 2015

2015-09-09: Autonomy Incubator Launches #MeetOurTeam

As the Autonomy Incubator settles down from weeks of nonstop demonstrations, we are launching a series of short videos called #MeetOurTeam. The series is designed to introduce our multidisciplinary team and to convey the unique importance of the skills each member brings to the project.

The Autonomy Incubator team consists of researchers from a variety of disciplines. We refer to our lead researchers as Principal Investigators (PIs), and their backgrounds span computer science, robotics, electrical engineering, mechanical engineering, aerospace engineering, psychology, machine vision, and machine learning. As a cohesive unit, our team designs autonomous systems with the self-contained ability to execute a mission without direct human involvement. The applications for this technology are vast, so we focus on a wide range of projects within autonomy.

To be successful, each of us depends on the hard work of our teammates. For example, Anna, our human factors specialist, focuses on the interactions between our autonomous systems and their users, working to make machine controls intuitive and easy to operate. Her contributions have been instrumental in projects designed to be operated by people without a technical background. Meghan's work on gesture-based control of UAVs falls into this category; among other things, it has proven exceptional in our outreach, as guests to our lab have been able to fly UAVs with a series of simple hand gestures. Loc works on obstacle avoidance and computer vision, which is vital to the research Jim is doing with the Mars Flyer, while Paul's avionics research benefits our more difficult missions and has facilitated our flight tests. Beyond the researchers, our team is supported by qualified UAS pilots, flight safety personnel, range safety officers, technicians, and administrators.

Stay tuned for more, and we look forward to having you #MeetOurTeam at the Autonomy Incubator.

Monday, September 7, 2015

2015-09-03: Autonomy Incubator Student Exit Presentation - Meghan Chandarana

On August 27th, Meghan Chandarana, the last of our Autonomy Incubator Summer 2015 student interns, completed the series of NIFS exit presentations by documenting and demonstrating her research results. A PhD candidate at Carnegie Mellon University, she reported on her work on gesture-based control of UAVs.



Meghan Chandarana presents
"Gesture-based Natural Language Interface for a UAV Ground Control System"

Tuesday, September 1, 2015

2015-09-01: Autonomy Incubator Hosts NASA TAC Program Director Doug Rohn and Deputy Director Richard Barhydt


The Autonomy Incubator hosted a visit today from NASA's Aeronautics Research Mission Directorate (ARMD). Transformative Aeronautics Concepts (TAC) Program Director Doug Rohn and Deputy Director Richard Barhydt stopped by the AI facility for an overview of the research being done, a demonstration of our current applications, and a discussion of the importance of further developing our work to meet the needs of future aeronautics missions.

NASA ARMD TAC Director Douglas Rohn and Deputy Director Richard Barhydt examine
an autonomous micro UAV, shown in flight in the video below.

The micro UAV in flight (beginning at the left of the screen)

Although most of our missions utilize small UAVs, the inclusion of micro UAVs in our vehicle portfolio demonstrates our capacity to transfer this technology to platforms of different sizes and functions. This versatility matters when we consider applications such as delivering packages or collecting data in cluttered environments: packages and payloads will not be uniform in weight and size, and some environments will demand very small vehicles.

Director Rohn and Deputy Director Barhydt were also interested in the AI's research on localization without the aid of external data such as GPS, since the applications for such technology extend to places on Earth as well as in space. To demonstrate the team's work in this area, Jim Neilan and Paul Rothhaar explained the AI's role in integrating visual odometry onto the Mars Flyer. Given the twenty-minute communication latency between Earth and Mars, vehicles on Mars must be able to navigate reliably without human control as NASA explores the red planet and collects data. That requires autonomous obstacle detection and avoidance so a vehicle can pilot its course and localize itself in a GPS-deprived environment.
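To give a feel for the idea behind visual odometry, here is a minimal sketch of the pose-chaining step only. This is an illustrative assumption about the general technique, not the Mars Flyer's actual implementation: a real system would estimate each frame-to-frame motion by matching camera features, whereas the relative motions below are hypothetical values.

```python
import numpy as np

def se2(dx, dy, dtheta):
    """Homogeneous 2D rigid transform for one frame-to-frame motion."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    return np.array([[c, -s, dx],
                     [s,  c, dy],
                     [0.0, 0.0, 1.0]])

# Hypothetical relative motions (dx, dy, dtheta), as a feature-matching
# front end might report them: the vehicle's global pose is recovered by
# composing these incremental transforms in order.
relative_motions = [(1.0, 0.0, 0.0),
                    (1.0, 0.0, np.pi / 2),
                    (1.0, 0.0, 0.0)]

pose = np.eye(3)  # start at the origin, heading along +x
for dx, dy, dth in relative_motions:
    pose = pose @ se2(dx, dy, dth)

x, y = pose[0, 2], pose[1, 2]
heading_deg = np.degrees(np.arctan2(pose[1, 0], pose[0, 0]))
print(f"x={x:.2f}, y={y:.2f}, heading={heading_deg:.1f} deg")
# -> x=2.00, y=1.00, heading=90.0 deg
```

Because each estimate is expressed in the vehicle's current frame, small per-frame errors accumulate over time; that drift is one reason such systems are typically paired with obstacle detection and other onboard sensing rather than used alone.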