Friday, December 11, 2015

2015-12-11: Goodbye to our fall interns Deegan Atha and Rebecca Goodpasture

Today marks the last day for Deegan Atha and Rebecca Goodpasture as interns at the Autonomy Incubator. In this post, we ask them to reflect on their experience this fall.


What did you do this semester?

     For my internship I was a computer vision intern in the Autonomy Incubator. My main project was to develop an object detection and classification solution to be used onboard a UAV. This entailed researching the state of the art in the field, developing an application based on that research, and then implementing it onboard a vehicle for testing. The biggest accomplishment over the course of the internship was running a full flight test of the detection and classification system on a UAV.

     Over the course of the internship I learned many things. First, I learned about the state of the art in convolutional neural networks as well as object segmentation approaches with point clouds. Second, I learned about the Agile methodology and how to work in a fast-paced environment. Third, I learned about the challenges still facing UAVs and the research that remains before we have truly autonomous vehicles. One of the best aspects of this internship was the freedom to develop my own solution for the project and to own what I was working on. This allowed me to fully engage in the project and learn a great deal, and it has inspired me to pursue graduate school in the field upon completion of my undergraduate degree.

     As a result of my project, I have been able to present the work twice at the Autonomy Incubator showcase to the Langley community. In addition, my mentor will present my work at the Aviation 2016 conference, and we hope to submit a paper to a computer vision or robotics conference. Beyond a great project, I have been able to learn about the NASA community as a whole and the many endeavors here at Langley, as well as participate in outreach events that further enhanced my experience.

As the internship ends, what are your next steps?

     I will be returning to Purdue for the spring semester of my junior year. I will be rejoining the CAM2 (Continuous Analysis of Many CAMeras) research team, of which I am a member, as well as an EPICS (Engineering Projects in Community Service) team that works on accessible classroom technologies for the Indiana School for the Blind and Visually Impaired.


What did you do this semester?

     Coming into the internship, my mentor, Dr. Danette Allen, instructed me to update the blog, tweet, and manage video and photo equipment. From this framework, I developed the following objectives: produce summaries of the Autonomy Incubator’s activities that were interesting and accessible for the average American, increase exposure for the Autonomy Incubator by establishing a robust social media presence across platforms, and maximize audience interaction with human interest pieces, visual content, and trending topics.

     To accomplish these objectives, I created and worked on a variety of projects. I began a photo series called #MeetOurTeam, consisting of human interest photographs overlaid with brief text, which introduced social media audiences to the unique individuals behind the Autonomy Incubator team. I created an Instagram account for the team, developed a Facebook page which is pending review, and worked with an intern in the CIO’s office to develop a NASA Langley webpage for the Autonomy Incubator team. For public relations outreach, I worked with the graphics department to develop a slide publicizing our monthly showcase, created Autonomy Incubator lapel pins for the team and mini fliers with our social media information for guests and interested parties, and transitioned our social media content to conform to NASA Style Guide standards.

     Other milestones of my internship include the lobby video I developed, which plays for guests of our building and is also shown at expos and during demonstrations at NASA. The content I created for social media received record-setting impressions and engagements with our audience, and dramatically increased our followers in the tech field (an overall audience increase of over 35%). The largest project I worked on was the Mars Flyer video, which involved working with a variety of groups on site.

     I learned a great deal during my internship, most notably within the realms of project management, social media marketing, and photo and video editing software. It was a privilege to work with the amazing people at NASA, and the opportunities to participate in outreach events and to organize tours for my peers were invaluable insights into life at Langley Research Center.

As the internship ends, what are your next steps?

     As of tomorrow, I will be traveling around Europe for a month collecting videos, photos, and interviews for my travel website. When I return home, I will continue to take online and open-source courses in engineering, coding, and science, as I am considering pursuing a second degree in aerospace engineering.

Wednesday, November 4, 2015

2015-11-04: Autonomy Incubator Celebrates One Year in B1222

Wednesday, November 4th, marked the one year anniversary of our team’s move to B1222, also known as the Old Reid Center. 

Moving Day! The Autonomy Incubator relocated to the former Reid Conference Center (b1222) today. I still can’t believe it… 
The new furniture is here! The new furniture is here!
Our first week in B1222
Since our move, the Autonomy Incubator has hosted over 35 student interns, given more than 50 tours (to center directors, the NASA Deputy Administrator, OMB, and OSTP, just to name a few!), and brought in close to 20 outside speakers to share their work with the Center. The facility itself measures over 70,000 cubic feet, and the forest, Mars landscape, and residential zone all help our researchers develop and test autonomous capabilities in realistic settings.

A young visitor navigates a UAV using Meghan Chandarana's
developments with gesture-based controls.
The large auditorium space enables us to test and demonstrate our algorithms. A fan favorite, Dr. Loc Tran's tree-dodging demo highlights our work with machine learning and with obstacle detection and avoidance.

In the past year, the Autonomy Incubator has worked on a variety of projects. We host student interns who lead their own research, such as Bilal Mehdi and Javier Puig-Navarro's work on coordinated trajectories for small and micro UAS, and we also continuously adapt to reflect research improvements and industry needs. You can read more about our most recent projects here.
A UAV autonomously navigates through our forest.

James Rosenthal and Jim Neilan set up our rover and "Green Machine" UAV for a demo.

An overhead view of our auditorium a year after our move.
From right to left: our residential area, forest, and the Mars landscape.

We are also pleased to announce that the Autonomy Incubator has received a project extension and an expanded name. As the need for autonomous capabilities becomes more apparent, NASA is transitioning our team name from “Autonomy Incubator” to “Langley Autonomy and Robotics Center,” although we will never stop incubating new and innovative missions!

An increasing number of projects at NASA Langley Research Center require the development of autonomous capabilities to ensure mission success, and we're looking forward to working with a variety of different teams on site. As we work to maximize our interaction with other groups at Langley, this past month’s showcase became an experiment in internal communications. We received a record number of NASA visitors due to heightened publicity for the event, and we left the showcase with a stronger connection to our audience and to their work.

As we reflect on all that has been accomplished in the past year, we also look towards developing autonomous capabilities to meet the future challenges that NASA will encounter in missions in space, science, and aeronautics. To stay up-to-date with the projects we're working on, follow us on social media.

Twitter: @AutonomyIncub8r
Instagram: @AutonomyIncubator

Tuesday, November 3, 2015

2015-11-03: Autonomy Incubator Showcase

Thank you to everyone who attended our October Showcase at NASA Langley Research Center!

The Autonomy Incubator hosts monthly showcases for our NASA colleagues, during which we report out on how we are rising to the autonomy and robotics challenges in space, science, and aeronautics. These events are snapshots of selected technologies and capabilities that we’re developing as we advance towards our goal of precise and reliable mobility and manipulation in dynamic, unstructured, and data-deprived environments. Check out the time-lapse video of our showcase, below.

We began this month’s showcase by highlighting the work of our fall computer vision intern, Deegan Atha. Deegan, a rising junior at Purdue University, discussed the on-board object classification capabilities he is developing with the help of his mentor, AI engineer Jim Neilan. As Deegan showed our audience the camera he’s been using and how it can be easily added onto different vehicles, he discussed the processing time per image (about 3 seconds) and our work using Euclidean segmentation for detection of objects.

In the coming month, Deegan will be working on in-flight training data, building 3-D point clouds onboard a vehicle to optimize accuracy, and ultimately demonstrating the object classification running on a flying vehicle. While we are not ready for him to leave the Ai, we are looking forward to his exit presentation!
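For readers curious about the Euclidean segmentation Deegan mentioned, the core idea is simple: points in a 3-D cloud that sit within some distance of one another get grouped into the same candidate object. Below is a minimal, library-free Python sketch of that idea; real pipelines use optimized implementations (such as the Point Cloud Library), and the 0.5 m tolerance here is purely illustrative.

```python
import math

def euclidean_cluster(points, tolerance=0.5, min_size=2):
    """Group 3-D points into clusters: two points share a cluster when
    they are within `tolerance` meters of each other, directly or
    through a chain of neighbors."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            i = queue.pop()
            # Grow the cluster with every unvisited point near point i.
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= tolerance]
            for j in near:
                unvisited.discard(j)
            queue.extend(near)
            cluster.extend(near)
        if len(cluster) >= min_size:  # drop isolated noise points
            clusters.append(sorted(cluster))
    return clusters
```

Each returned cluster is a list of point indices; a detector would then treat each cluster (a chair, a person, a tree) as one object to hand to the classifier.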

Deegan Atha, our computer vision intern, discusses his work in object classification.

In the above image, we can see the object classification working, as the
computer recognizes the chair and the NASA employee.

Jim Neilan highlights the different UAV designs in our lab.

Following Deegan, Jim Neilan briefly discussed our vehicles and the importance of our technology's platform portability. Although we might test on specific vehicles, like Herbie, our beloved rover, the autonomous behaviors we are developing are not designed to be vehicle-specific. 

The microphone was then handed to Anna Trujillo, our human factors engineer. At the Autonomy Incubator, we're working on creating autonomous systems that limit the need for constant piloting of a vehicle. However, while we're creating behaviors that can function independently of a pilot, we still need a human to define, monitor, and adjust mission specifications. Anna discussed her work developing controls that can be easily understood by a person without piloting experience. She began with the example of a science mission, in which NASA scientists would use a UAV to test ozone and CO2 levels. Using a display program Anna developed, scientists can use a tablet to get raw data values in real time, without waiting for post-processing. Information such as vehicle position, time, and sensor data helps the scientists ensure the test is running smoothly and that the data they're getting makes sense.

Display system created by Anna, designed to enable scientists to gather data in real time.

A package delivery display, in which the human operator can designate the size
of the package and locations of takeoff, drop off and landing.

Anna has also created a user display for package delivery. Using this system, a delivery person can easily input information about the size of the package and then choose separate locations for where the vehicle will take off, drop off the package, and land. After the user designates this information, the underlying algorithms determine mission specifics, such as trajectory, without direction from a pilot. To emphasize the general accessibility of the application, Anna asked a young boy in the audience to input values for the package delivery. He began by inputting a package code, the first of which was pre-programmed for a routine delivery. For example, the NASA Langley mail office might deliver the same type of package to our center director at the same time every day. To show what would happen in the event of a non-routine package delivery, our volunteer created a new code, and then quickly and easily filled in the different data fields by hand.
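The routine-versus-new package code workflow can be pictured as a simple lookup: known codes map to a stored delivery specification, while unknown codes fall back to operator-entered values. The sketch below is purely illustrative; the code names and field names are invented for this post and are not Anna's actual interface.

```python
# Hypothetical routine-delivery table; codes and fields are illustrative.
ROUTINE_DELIVERIES = {
    "MAIL-01": {"size": "small", "takeoff": "mail office",
                "dropoff": "director's office", "landing": "mail office"},
}

def delivery_spec(code, manual_entry=None):
    """Return the mission specification for a package code, preferring a
    stored routine delivery; otherwise use the operator's hand-entered
    values, as our young volunteer did for his new code."""
    if code in ROUTINE_DELIVERIES:
        return ROUTINE_DELIVERIES[code]
    if manual_entry is None:
        raise ValueError(f"unknown code {code!r}: manual entry required")
    return manual_entry
```

The returned specification is what the underlying trajectory algorithms would consume, so the operator never touches anything pilot-specific.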

Ben Kelley with Herbie, our beloved rover. 

Continuing our discussion of human/vehicle interaction, Ben Kelley, our software architect, showcased his "Follow Me" behavior. In the "Follow Me" demo, our "Green Machine" UAV follows a rover. The behavior occurs when the UAV receives messages of the rover's position and alters its current waypoint (destination) to reflect the location one meter above the rover.

The "Green Machine" in flight, exhibiting the "Follow Me" behavior.

The UAV performs the "tree-dodging" developed by Dr. Loc Tran.

Dr. Loc Tran works on a variety of machine-learning projects at the Autonomy Incubator, and he presented his work with obstacle detection and avoidance. This "tree-dodging" behavior is an important area of research to highlight, as maneuverability and obstacle avoidance are integral to safe UAV flight. "Tree-dodging" begins when Loc gives the UAV the command to take off, with the directive that it is to navigate itself through the trees. Without any intervention from Loc or a pilot, the UAV uses a front-facing camera to detect and avoid certain features, such as branches from one of our artificial trees. If the UAV makes a mistake with trajectory planning, Loc will later correct the vehicle and designate a new rule of response. Through repetition of this process, the vehicle becomes skilled enough to navigate the course perfectly and, ideally, will be able to do the same in a course it has never encountered. One day, "tree-dodging" may help deliver supplies to areas that are difficult to navigate through, such as rainforests, and might be ported to underwater or planetary exploration applications.
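The correct-and-repeat training cycle described above — fly, find the mistakes, fold the expert's corrections back in, and fly again — can be sketched schematically. This is a simplified illustration of the general idea only (the learned "rules" here are a plain lookup table keyed by observation), not Loc's actual learning system.

```python
def train_by_correction(course, expert_fix, rules=None, max_rounds=10):
    """Repeatedly 'fly' the course; after each flight, designate a new
    rule of response for every observation the current rules got wrong,
    stopping once the course is navigated perfectly.

    course:     sequence of observations seen along the course
    expert_fix: callable mapping an observation to the correct action
    """
    rules = dict(rules or {})
    for _ in range(max_rounds):
        # A "mistake" is any observation where the learned rule
        # disagrees with the expert's correction.
        mistakes = [obs for obs in course if rules.get(obs) != expert_fix(obs)]
        if not mistakes:
            break  # the vehicle now navigates the course perfectly
        for obs in mistakes:
            rules[obs] = expert_fix(obs)
    return rules
```

A real system would generalize from corrections (so it can handle a course it has never encountered) rather than memorize them, but the outer loop has this same shape.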

Our pilot, Zach Johns, with the Mars Flyer. 

The showcase ended with a demonstration of the Mars Flyer and the autonomous capabilities our team is designing for NASA missions to the red planet. Jim Neilan, our project lead, pointed to a small computer running visual odometry algorithms onboard the Mars Flyer and explained its ability to reliably navigate the vehicle in a GPS-denied environment such as Mars. Visual odometry begins with data from the downward-facing camera. Processed onboard and in real time, calculations of the movement of different ground features enable the computer to accurately create a 3-D map of the terrain.
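To give a feel for the math, here is one simplified piece of visual odometry: with a downward-facing camera at a known altitude over roughly flat ground, apparent feature motion in the image maps to vehicle motion over the ground through the pinhole camera model. A real system tracks many features and recovers the full vehicle motion; this sketch assumes pure horizontal translation and is only an illustration of the scale relationship.

```python
def ground_displacement(pixel_dx, pixel_dy, altitude_m, focal_px):
    """Convert apparent feature motion in the image (pixels) to vehicle
    motion over the ground (meters) via the pinhole model: dX = dx * Z / f,
    where Z is altitude and f is the focal length in pixels."""
    scale = altitude_m / focal_px  # meters of ground per pixel
    # Ground features appear to move opposite to the vehicle's motion.
    return (-pixel_dx * scale, -pixel_dy * scale)
```

Chaining these frame-to-frame displacements (plus heading) is what lets the computer build up position, and a map, without GPS.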

Zach flies the Mars Flyer, while Dave North (on the right), explains the vehicle design.

At the Autonomy Incubator, we are continuously adapting to reflect research improvements and new NASA projects that require the the development of different autonomous capabilities. To stay up-to-date with these projects, follow us on social media.

Twitter: @AutonomyIncub8r
Instagram: @AutonomyIncubator

Friday, October 30, 2015

2015-10-30: Autonomy Incubator Celebrates Halloween & Welcomes Bubbles

As the Autonomy Incubator celebrates Halloween, we welcome our new Ai mascot and resident trick-or-treat enthusiast. Meet Bubbles, the spider. #MeetOurTeam

While Bubbles spends most of her day hanging out in her web inspiring our team's research in autonomy and robotics, in her off hours she indulges in errant Crazyflies.

Introducing Bubbles, our Autonomy Incubator mascot.

On Tuesday, the Autonomy Incubator team will be entering the Langley Chili Cook-off, where Bubbles and her family will be making their NASA debut. Stay tuned with us on social media for more on this exciting partnership.

Twitter: @AutonomyIncub8r
Instagram: @AutonomyIncubator

Wednesday, October 28, 2015

2015-10-28: Autonomy Incubator Demonstrates Package Delivery

Here at the Autonomy Incubator, we're researching ways to safely deliver packages using small unmanned aerial vehicles (UAVs). Our designs include vehicle components to carefully deliver a package, computer vision to detect and classify objects, obstacle avoidance, and programs such as our "Follow Me" behavior.

The commercial applications for this technology are vast, and provide potential solutions to missions dangerous for humans, such as delivering humanitarian aid or fighting forest fires. However, most widely known is the potential this technology holds for future commercial package delivery by companies. In fact, Flirtey, an Australian commercial delivery service that uses UAVs, stopped by the Autonomy Incubator on Monday to learn more about the work our team is currently doing.

As companies, and NASA, look towards the future of package delivery, safety is a prime concern. Here at the Autonomy Incubator, we're addressing these concerns by developing programs for obstacle avoidance and navigation in data-deprived environments. Although known paths and locations will likely be available in most of the areas we would deliver to on Earth, we need the capacity to navigate without GPS as a back-up if the signal is lost, as well as for exploratory missions in space and on planets where GPS is not available.

A mechanical engineer at the Autonomy Incubator, Joe Lemanski, recently designed a mechanism we have used to carry and release a package from a UAV. As said by Dr. Loc Tran, with this mechanism, "we fully employ the use of gravity"; in other words, we're safely dropping off the package on a designated landing spot.

Danette looks on as visitors hold and discuss the drop mechanism developed by Joe Lemanski.

Meanwhile, Dr. Loc Tran and our fall intern Deegan Atha have been working on projects within the realm of computer vision. At present, we are leveraging computer vision as a means to detect obstacles and to classify objects, such as a destination point for a package delivery.

Representatives from Flirtey watch as Ben Kelley demonstrates safe obstacle avoidance with UAVs.

The "Follow Me" behavior, created by our software architect Ben Kelley, is still relatively new. As we move past these early stages of development, we'll be looking into the possibility of using this behavior with a fleet of vehicles. In a manner similar to a paperboy and the basket on his bike that he reaches into for a paper, the small UAVs could go to and from a moving rover to pick up packages and deliver them to their destinations. As the rover moves, the UAV will employ the "Follow Me" behavior to catch up with the rover, pick up another package, deliver the package, and then employ the "Follow Me" behavior again to repeat the cycle. 

Jim Neilan tests the UAV package delivery mechanism with Herbie, our rover.

Tuesday, October 27, 2015

2015-10-27: Autonomy Incubator begins celebration of NASA Langley Centennial

On October 27th, the Autonomy Incubator team gathered with about 1,000 employees, retirees and family members for an aerial photo outside the hangar at NASA Langley Research Center. This centennial photo marks the beginning of our celebrations commemorating 100 years of research at NASA Langley. 

Credits: NASA/Sandie Gibbs

In 1917, the U.S. government established Langley Research Center, the oldest of NASA's field centers. Just fourteen years after the success of the Wright Flyer, Langley focused on furthering research in aeronautics and scientific discovery. As the center continues its centennial festivities, culminating in events on July 17, 2017, we reflect on our center's history and look towards another 100 years of cutting-edge developments.

This quote from Neil Armstrong lies outside the front entrance to the Autonomy Incubator. 

Here at the Autonomy Incubator, we're doing the same. Next week marks one year that we've been in our building, the old Reid Center, and over the course of the upcoming year we will transition in name to the Langley Autonomy and Robotics Center (AR) as NASA extends the scope and duration of our research. As autonomy becomes increasingly vital to the success of science, space, and aeronautics missions, Langley is integrating our work with a variety of projects. NASA Langley recently announced that our Autonomy Incubator team will be working on developing "a flying armada of 100 unmanned aerial vehicles", to be exhibited at the centennial event on July 17, 2017.

To stay updated on our progress with the flying armada, and our continuing work in package delivery, computer vision, obstacle avoidance, and localization, follow us on social media.
Twitter: @AutonomyIncub8r
Instagram: @AutonomyIncubator

The Autonomy Incubator team at the centennial photo.

Monday, October 26, 2015

2015-10-26: Autonomy Incubator Demonstrates "Follow Me" Behavior

Recently, the Autonomy Incubator has been demonstrating the "Follow Me" behavior designed by our software architect Ben Kelley. 

The behavior begins with our motion capture system's xyz position data. The interface converts this data into our room coordinates, which are in turn converted to relative latitude and longitude (World Coordinates). "Follow Me" is the behavior itself, which occurs when the UAV receives messages of the rover's position and alters its current waypoint (destination) to reflect the location one meter above the rover.

Jim Neilan and James Rosenthal set up our rover and "Green Machine" UAV.

The "Follow Me" behavior runs at 1 Hz, which means that the UAV updates its waypoint to reflect the current location of the rover every second. In the video below, we see our "Green Machine" UAV following the rover in real time.
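For the technically inclined, the waypoint update described above can be sketched in a few lines. The equirectangular conversion from room meters to latitude/longitude and the axis convention (x east, y north) are our illustrative assumptions here, not the exact interface code.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def room_to_world(x_m, y_m, origin_lat, origin_lon):
    """Convert room-frame meters (assumed x east, y north) into a
    latitude/longitude offset from the room origin, using a small-area
    equirectangular approximation."""
    dlat = math.degrees(y_m / EARTH_RADIUS_M)
    dlon = math.degrees(
        x_m / (EARTH_RADIUS_M * math.cos(math.radians(origin_lat))))
    return origin_lat + dlat, origin_lon + dlon

def follow_me_waypoint(rover_xyz, origin_lat, origin_lon):
    """New UAV waypoint: the rover's position, one meter higher.
    In the demo, this update fires once per second (1 Hz)."""
    x, y, z = rover_xyz
    lat, lon = room_to_world(x, y, origin_lat, origin_lon)
    return lat, lon, z + 1.0
```

Because the update is just "re-aim at wherever the rover is now, plus a meter," the UAV naturally trails the rover without any path planning of its own.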

Tuesday, October 20, 2015

2015-10-20: Autonomy Incubator Celebrates Anna Trujillo's 25 Years of Service

For 25 years, Anna Trujillo has been an integral part of the NASA Langley Research Center. As our resident human factors specialist at the Autonomy Incubator, Anna develops operator interfaces and leads our research on the interaction between machine and human.

For 25 years, Anna Trujillo has been an integral part of the NASA Langley Research Center.

Anna began her engineering career as an undergraduate at MIT, where her love of science led her to major in aerospace engineering. The Three Mile Island accident in 1979 inspired Anna to pursue a career to improve the interaction and efficiency between humans and technology, and so after she completed her undergraduate degree, Anna studied human factors at the University of Michigan. Human factors was a relatively new field, and it was initially classified under industrial engineering. Anna also took classes in psychology, and focused on aeronautics with a subspecialty in controls.

Directly out of college, Anna accepted a job offer from NASA Langley Research Center as a human factors engineer. Her first project was a high speed civil transport, the HSCT. A commercial version of supersonic transport (SST), the HSCT was NASA's vision of a supersonic passenger jet that would be able to travel distances in less than half the time of a modern subsonic airplane. 

For about 10 years, Anna worked with Aviation Safety, and completed important research for cockpit safety regarding warning systems and information displays. "I always liked airplanes and the piloting side of what they were doing", she says; "what are they doing, and can what they're doing be improved?" One of Anna's many research papers focuses on her work with displays and alerting systems. What information does the pilot really need? She looked into new ways of showing information, so that someone could quickly understand the issue that was being indicated by the display. The idea, she says, is to give the pilot more time in the event of failure. 

While working on Aviation Safety, Anna also helped with the Unmanned Aircraft Systems Integration in the National Airspace System project, which is based out of the NASA Armstrong Flight Research Center. The project focuses on improving unmanned aircraft systems to safely integrate them into daily life, and during this time Anna became interested in the application of small UAVs as they "change the way you look at piloting a vehicle".

This led Anna to the Autonomy Incubator, where she currently lends her expertise in human factors, controls, and developing effective pilot and operator interfaces. Our team celebrates Anna's impressive 25 years of work, and her ongoing commitment to NASA and the American public.

You can check out Anna's research papers below.
     -A Centralized Display for Mission Monitoring
     -Experience and Grouping Effects when handling Non-Normal Situations
     -Predicting Information: Status or Alert Information? 
     -Using Simulation Speeds to Differentiate Controller Interface Concepts

Wednesday, October 21, 2015

2015-10-21: Autonomy Incubator Hosts Booth at the Hampton Roads Transportation Expo

Carol Castle and Rebecca Goodpasture represented NASA LaRC Autonomy at the Hampton Roads Transportation Expo. Groups across the transportation, infrastructure and construction industries hosted booths and spoke with educators and high school students about career opportunities.

Carol Castle and Rebecca Goodpasture at the NASA LaRC Autonomy Incubator booth.

Along with our representatives from the Autonomy Incubator team, Mike Logan of the Small Unmanned Aerial Vehicle (SUAVE) Lab fielded questions pertaining to mechanical engineering and UAV projects. Mike brought two UAVs that were designed at NASA Langley, as well as a virtual reality headset to demonstrate the "first person" flying experience of UAVs with attached cameras. Most of the kids in attendance expressed an interest in learning to design and develop similar vehicles.

Many students we talked to were surprised at the wide array of research currently underway at NASA Langley, and we touched upon current projects involving air quality and climate change, the development of Lunar Habitats, the work our team and the SUAVE Lab are doing with autonomy and UAVs, and more. Correspondingly, there is a range of educational backgrounds and skills needed for the overall success of the center as well as of each project. We talked to students interested in chemistry, medicine, engineering, computer vision, psychology, social media, business, atmospheric science, and more.

Students interested in internship opportunities at Langley should check out the NASA student portal, which offers opportunities for current high school and college students as well as recent college graduates. Educators who would like to schedule a visit to their class from a NASA Langley representative can do so through our Langley Research Center Speakers Bureau.

Monday, October 19, 2015

2015-10-08: Autonomy Incubator Team Observes ONR Flight Test

The Autonomy Incubator team traveled to Flying Circus Airfield in Bealeton, VA to observe a day of flight testing for Autonomous Aerial Cargo/Utility System (AACUS).

AACUS is an Innovative Naval Prototype (INP) program sponsored by the Office of Naval Research (ONR) that explores advanced autonomous capabilities for reliable resupply/retrograde and, in the long term, casualty evacuation by an unmanned air vehicle under adverse conditions. AACUS consists of software and sensors that can be applied to a variety of rotary wing aircraft, and will provide the U.S. Marine Corps with the ability to rapidly support forces on the front lines, as an alternative to convoys, manned aircraft or air drops in all weather and possibly hostile conditions, with minimal training required by the requestor.

Wednesday, October 14, 2015

2015-10-06: Autonomy Incubator Welcomes Small Businesses

On October 6th, the Autonomy Incubator welcomed a group of small businesses from southern Virginia. The tour was hosted by the Office of Small Business Programs, which provides small businesses an opportunity to participate in NASA contracts in space exploration, scientific discovery, and aeronautics research.

Representatives from small businesses arrive at the Autonomy Incubator.

The Autonomy Incubator team presented the usual demonstrations, as we highlighted the various applications of autonomous technology. After Ben Kelley showcased our work with UAV obstacle avoidance, which is integral to unmanned package delivery, Jim Neilan discussed his research on data-deprived localization, and the importance it holds to safe navigation on Earth and planetary exploration on Mars. These demonstrations were particularly applicable to the representatives from businesses that are (like us!) involved with autonomy and robotics.

Ben Kelley demonstrates autonomous UAV obstacle avoidance. In the above image,
he mimics a pedestrian on a cell phone. Without any action on his part, the UAV
detects him as an obstacle and maintains a safe distance.

Jim Neilan describes the application of autonomous
capabilities to aircraft headed for Mars. 

It was a pleasure to provide a tour for these local small businesses, and we look forward to future interaction. Thank you for visiting the Autonomy Incubator!

Shielded by a protective barrier, Kyle McQuarry sets up the program that launches the UAVs in flight.

Thursday, October 1, 2015

2015-09-28: Autonomy Incubator Seminar Series, Simon Haykin

Once a month, the Autonomy Incubator invites a speaker from the field of autonomy to share their expertise. Last week, the distinguished guest of our seminar was Dr. Simon Haykin of McMaster University. 

Dr. Simon Haykin
Titled “Risk Control in the Cognitive Dynamic System”, Dr. Haykin’s presentation touched on his efforts to link engineering with cognitive science for the benefit of autonomously intelligent machines. He began by discussing the communication problems in linking the two disciplines, an issue our AI human factors engineer, Anna Trujillo, often touches on. 

The lecture then focused primarily on the concept of controlling risk and mimicking a human's cognitive processing. How do we strengthen and improve an autonomous system? Dr. Haykin proposes the following: supply the system with cognition, bring in control and decision, and ultimately the machine can manage both uncertainty and risk.

Evaluating and overcoming risk is the most difficult cognitive function for a machine to mimic. It requires a structure similar to the feedback channels in humans, as the internal network must decide how to act based on observation and evaluation of the environment. The ability of a machine to observe and evaluate thus becomes integral, and for this Dr. Haykin suggests a machine's working memory. The working memory is split into two libraries of potential action: one holds actions that have been taught, while the other holds past experiences. In the face of uncertainty, the latter library is more reliable and closer to the world, as the machine can draw on its experiences in a situation rather than a learned action for the given circumstance.

At the Autonomy Incubator, we design autonomous systems that have the self-contained ability to execute a mission without direct human involvement, and so much of our work falls in accordance with this concept. Similar to Dr. Haykin's ideal system in which the machine recognizes the correct cognitive action to be performed within the environment, our computer vision engineer Dr. Loc Tran is researching machine learning to provide path training and autonomous obstacle avoidance (tree-dodging). 

Thanks again to Dr. Haykin for coming to NASA LaRC!

The Autonomy Incubator team with Dr. Simon Haykin.

Thursday, September 10, 2015

2015-09-09: Autonomy Incubator Launches #MeetOurTeam

As the Autonomy Incubator settles down from weeks of nonstop demonstrations, we are launching a series of short videos called #MeetOurTeam. The series is designed to introduce our multi-dimensional team and to convey the unique importance of the skills each member brings to the project.

The Autonomy Incubator team consists of researchers from a variety of disciplines. We refer to our lead researchers as Principal Investigators (PIs), and they have backgrounds as varied as computer science, robotics, electrical engineering, mechanical engineering, aerospace engineering, psychology, machine vision, and machine learning. As a cohesive unit, our team designs autonomous systems that have the self-contained ability to execute a mission without direct human involvement. The applications for this technology are vast, and so we focus on a wide range of projects within autonomy.

To be successful, each of us depends upon the hard work of our teammates. For example, Anna, our human factors specialist, focuses on the interactions between our autonomous systems and their users, and she seeks to make machine controls intuitive and easy for a user to operate. This has been instrumental in projects designed to be operated by people without a technical background. Meghan's work on gesture-based control of UAVs falls under this category, and her work has also proven exceptional in our outreach, as guests to our lab have been able to fly UAVs with a series of simple hand gestures. Loc works on obstacle avoidance and computer vision, which is vital to the research Jim is doing with the Mars Flyer, while Paul's avionics research significantly benefits our more difficult missions and has facilitated our flight tests, and so on. Aside from researchers, our team is supported by qualified UAS pilots, flight safety personnel, range safety officers, technicians, and administrators.

Stay tuned for more, and we look forward to having you #MeetOurTeam at the Autonomy Incubator.