Monday, June 24, 2019

2019-06-24: Hey, Javier, What Have You Been Up To?



Javier Puig-Navarro has been a remarkable member of the Autonomy Incubator team since 2014.  It has been wonderful working with him for so many years while he does research for his Ph.D. in aerospace engineering at the University of Illinois at Urbana-Champaign.

His two main focuses are research supporting search and rescue missions and his work on the ATTRACTOR project.

Watch his spotlight interview above to learn more about his work!

Wednesday, June 19, 2019

2019-06-19: Ai's Latest Demo for Multiple Agencies

Several representatives offered some interesting questions and points relevant to our research.

Some of the most exciting moments here at the Autonomy Incubator are when we get to offer demonstrations to curious individuals and groups. This week, representatives from several different government agencies, including NRL and AFRL, visited NASA Langley for a tour and stopped by our branch.

We had the pleasure of showing them our flight room, which is still being renovated (and we can't wait to start using it again soon)!

Danette Allen showing off our flight room and wire maze.

Danette Allen, Walter Waltz, Sherif Shazly, and Ben Hargis each presented a little bit about their research, speaking about their recent accomplishments and goals.

Since one of our main focuses at the Ai is multi-agent collaborative autonomous assembly, the team discussed what we have accomplished to date with the In-Space Assembly (ISA) project through a sub-scale demonstration.  For this demonstration, we worked with multiple quarter-scale trusses and a single robotic arm.

We do a lot of our research via simulation, and I had the pleasure of putting together a few videos so our team members could show what that looks like.

Danette walking through one of the videos.

Danette passed the baton to Walter so he could share what he has been working on for the last year.  He described his research on single-agent autonomous assembly, where he focuses on motion planning so that the robot can successfully carry out each step of the assembly process. The four major aspects his work emphasizes are object detection, motion planning, collision avoidance, and trajectory execution.

Walter Waltz has been at the Ai for about a year.

The robotic arm picks up a truss from a random location and places it in a predefined location.  As it goes through the mission, it moves through a dynamic sequence of stages, which lets him and the other researchers rigorously test new algorithms, try out different planners, and so on.
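
To give a sense of what such a staged mission can look like, here is a minimal sketch in Python. The stage names, poses, and planner interface are hypothetical illustrations, not the team's actual software; the point is that swapping planners or algorithms only changes what happens inside the planning call.

```python
# Hypothetical sketch of a staged pick-and-place mission for a single arm.
# Stage names and the plan/execute interface are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def plan_and_execute(stage: str, target: Pose) -> bool:
    """Stand-in for a motion-planner call (e.g., find a collision-free
    trajectory to `target` and run it); returns True on success."""
    print(f"[{stage}] moving to ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")
    return True  # assume success in this sketch

def run_mission(truss_pickup: Pose, truss_place: Pose) -> bool:
    # A dynamic sequence of stages the arm steps through during the mission.
    stages = [
        ("approach_pickup", truss_pickup),
        ("grasp", truss_pickup),
        ("retract", Pose(truss_pickup.x, truss_pickup.y, truss_pickup.z + 0.2)),
        ("approach_place", truss_place),
        ("release", truss_place),
        ("home", Pose(0.0, 0.0, 0.5)),
    ]
    for name, target in stages:
        if not plan_and_execute(name, target):
            print(f"Stage '{name}' failed; aborting and replanning.")
            return False
    return True

if __name__ == "__main__":
    run_mission(Pose(0.4, -0.3, 0.1), Pose(0.5, 0.2, 0.1))
```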

For these autonomous behaviors, it is extremely important to get the motion planning and execution right. One distinctive element of some of the trajectories is that they must observe constraints, which complicates both planning and execution.
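
As a rough illustration of what a trajectory constraint can mean in practice, the sketch below checks a hypothetical "keep the carried truss level" constraint at each waypoint; the tolerance and waypoint format are made-up assumptions, not the team's actual constraint set.

```python
# Minimal sketch of checking a path constraint: keep the end effector level
# (small roll and pitch) at every waypoint. Numbers are illustrative only.
import math

def satisfies_level_constraint(roll: float, pitch: float, tol_deg: float = 5.0) -> bool:
    tol = math.radians(tol_deg)
    return abs(roll) <= tol and abs(pitch) <= tol

def validate_trajectory(waypoints):
    """waypoints: list of (x, y, z, roll, pitch, yaw) tuples in meters/radians."""
    for i, (_, _, _, roll, pitch, _) in enumerate(waypoints):
        if not satisfies_level_constraint(roll, pitch):
            return False, i  # constraint violated at waypoint i; replan needed
    return True, None

ok, bad_idx = validate_trajectory([
    (0.4, -0.3, 0.1, 0.00, 0.01, 0.0),
    (0.4, -0.3, 0.3, 0.02, 0.00, 0.5),
    (0.5,  0.2, 0.3, 0.15, 0.00, 1.0),  # roll too large -> violates constraint
])
print("trajectory ok:", ok, "first violation at waypoint:", bad_idx)
```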

Of course, all of this development begins in a simulation environment, as previously established. This way, Walter and his research team can better understand the algorithms and how to form requirements from them, leading toward the validation and rigorous qualification necessary for flying to space.

Sherif is one of the robotics software developers on the ISA team.

Following Walter, Sherif took the spotlight to discuss his focus on using 3D point clouds to generate occupancy grids of the robot's workspace for collision checking.

The algorithm encodes whether each region of the workspace is occupied, unoccupied, or unexplored, and uses that information to create plans that avoid hitting not only the truss assembly but any other object.  It does this very efficiently, which also allows us to share the planning scene efficiently.  That matters to the ISA researchers because we, as a team, have limited resources and do not want to waste what we have.  We are hoping to expand this idea through the inertial transfer concept, which Ben detailed next.
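
As a rough illustration of the idea (not the team's actual implementation), the sketch below marks voxels of a workspace grid as occupied, free, or unknown from a set of 3D points. The grid size, resolution, and sample points are made up; a real system such as OctoMap also ray-traces from the sensor through each measurement to carve out known-free space.

```python
# Rough sketch: turning 3D points into a three-state occupancy grid.
# In this sketch nothing is ever marked FREE; a real pipeline ray-casts from
# the sensor origin to distinguish free space from unexplored space.
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def build_occupancy_grid(points, origin, resolution=0.05, dims=(64, 64, 64)):
    grid = np.full(dims, UNKNOWN, dtype=np.int8)
    # Map each measured point to a voxel index and mark that voxel occupied.
    idx = np.floor((points - origin) / resolution).astype(int)
    in_bounds = np.all((idx >= 0) & (idx < np.array(dims)), axis=1)
    idx = idx[in_bounds]
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = OCCUPIED
    return grid

def is_collision_free(grid, voxel):
    # Treat unknown space conservatively: only plan through known-free voxels.
    return grid[voxel] == FREE

# Example: a tiny cloud of points near the truss assembly (made-up numbers).
cloud = np.array([[0.50, 0.20, 0.10], [0.52, 0.21, 0.12], [0.90, 0.90, 0.40]])
grid = build_occupancy_grid(cloud, origin=np.array([0.0, 0.0, 0.0]))
print("occupied voxels:", int(np.sum(grid == OCCUPIED)))
print("voxel (10, 4, 2) safe to plan through:", is_collision_free(grid, (10, 4, 2)))
```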

This is Ben's second summer with the Ai as a Pathways intern.

Ben defined inertial transfer as "the concept of moving untethered objects through space using the object's own inertia."

Using a concept video from the RAMSES project, he gave a detailed description of how it works.  First, the front manipulator arm transfers an instrument panel from storage to the manipulator in charge of installing the panel. The efficiency here comes from the free-flying mass, and additional structures allow these manipulators to cover the distance shown in the graphic.  As soon as the object leaves the manipulator's grasp, it can be tracked, depending on what metrology strategy is employed.
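
A back-of-the-envelope way to see why this works: in microgravity a released object coasts at (nearly) constant velocity, so the catching manipulator can predict where and when to expect it. The sketch below uses entirely made-up positions and timing, just to show the arithmetic.

```python
# Back-of-the-envelope sketch of inertial transfer: a released object coasts
# at constant velocity in microgravity. All numbers here are illustrative.
import numpy as np

def release_velocity(release_pos, catch_pos, transfer_time):
    """Velocity the throwing manipulator must impart to arrive on time."""
    return (np.asarray(catch_pos) - np.asarray(release_pos)) / transfer_time

def predicted_position(release_pos, velocity, t):
    """Drag-free, gravity-free prediction used to track the free-flying object."""
    return np.asarray(release_pos) + np.asarray(velocity) * t

release = [0.0, 0.0, 0.0]   # meters: thrower's release point
catch   = [3.0, 0.5, 0.0]   # meters: receiver's grasp point
T = 10.0                    # seconds allotted for the free flight

v = release_velocity(release, catch, T)
print("required release velocity (m/s):", v)
print("predicted position at t = 5 s:", predicted_position(release, v, 5.0))
```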

Since this is a relatively new project, our exploration raises some questions that still need to be answered, such as what kind of infrastructure is best for communication, what information needs to be passed and when, and what our metrology strategies are: what are we going to measure?

All of these steps are very important in mitigating risks so that we can learn from each consideration. There is always going to be some randomness in the process, which is why it is important to have contingency plans for failed grasps. These failures can range from awkwardly grasping an object and ending up at an impasse to a near miss.

A major goal, of course, is to reduce and mitigate as much of this risk as possible.

It was the Ai's pleasure to present another demo and answer the questions that our visitors asked.  They also made some very useful points for us to consider in the future.

Thank you ISA community, we hope to see you again!

Friday, June 14, 2019

2019-06-14: New VICON Cameras Take Our Robotic Arm to the Next Level

One of the two cameras mounted above the stage.

We are always more than happy to begin testing new gadgets and technology. With the arrival of six new VICON™ cameras for tracking and motion capture in our sub-scale robot area, Kyle McQuarry and Sherif Shazly have been working diligently to set them up so that we can improve our research and abilities with the beloved robotic arm.

One of the new VICON™ cameras sits in the wall behind the arm.

Kyle McQuarry cleaning up cords from the top right camera.

Along with the VICON™ cameras, we have also gained some new Kinect™ sensors.  These are cameras that give you point-cloud images so that you can view a 3D representation of the world.  This is a huge advantage, as Sherif described, because prior to adding them, the team was essentially working off of assumptions, statistically publishing where they thought objects were in space; now they can be even more precise.

Point-cloud information is already very accurate, but now it can be placed in the correct world frame of reference, making it even more useful.

These point-cloud images can also give you octree representations, in which the octomap takes dense information and filters it down into tiny boxes, as you can see below.  The areas colored green are mesh, and the rainbow boxes are octree voxels.

The rainbow boxes are octree voxels.
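
To illustrate the "filter dense points into tiny boxes" step, here is a simple voxel downsample in plain NumPy. This is not the team's pipeline; OctoMap builds a true octree with probabilistic occupancy, but the grouping idea is the same.

```python
# Illustrative sketch: collapse a dense point cloud into voxel centroids.
import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    """Replace all points falling in the same voxel with that voxel's centroid."""
    keys = np.floor(points / voxel_size).astype(int)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inverse)
    centroids = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        centroids[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return centroids

dense = np.random.rand(100_000, 3)          # stand-in for a Kinect point cloud
sparse = voxel_downsample(dense, voxel_size=0.05)
print(f"{len(dense)} points reduced to {len(sparse)} voxel centroids")
```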

"The VICONTM cameras haven't been added to the pipeline yet," Sherif explained, "but hopefully we can use them to give the position of the Kinect sensors to get centimeter accurate point-cloud information."

Once the VICON™ cameras are in use, they will need to be calibrated every single day.  The team uses a wand lined with retro-reflective tracking points, waving it around the capture volume until the calibration is verified.  Then the wand is placed at the origin position next to the robotic arm.
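
Putting the two together, a VICON-measured camera pose plus a Kinect point cloud, comes down to applying a rigid-body transform so the points land in the world frame defined by that origin. Here is a minimal sketch; the rotation and translation values are made up for illustration.

```python
# Minimal sketch of registering a Kinect point cloud into the VICON world
# frame, given the camera's pose as measured by the motion-capture system.
import numpy as np

def transform_to_world(points_cam, R_world_cam, t_world_cam):
    """Apply p_world = R * p_cam + t to every point in the cloud."""
    return points_cam @ R_world_cam.T + t_world_cam

# Hypothetical pose: camera rotated 90 degrees about Z, mounted 1.5 m above
# the stage origin (illustrative numbers, not measured values).
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.0, 0.0, 1.5])

cloud_cam = np.array([[0.2, 0.0, 1.0], [0.0, 0.1, 1.2]])   # points in camera frame
print(transform_to_world(cloud_cam, R, t))
```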

Our robotic arm and Kinect™ sensors are set up on a stage with the VICON™ cameras around it.

Next week, we will be giving a demonstration for Roberta Ewart on how each aspect works together for In-Space Assembly.  It is more than exciting to be able to show it all off.

Friday, June 7, 2019

2019-06-07: It's Summer, and the Interns are Back!

The Excellent Eight

The summer has officially arrived and B1230 is filled with this season's interns.  We are welcoming a few new faces, while also opening our arms back up to some former friends.

Let's meet everyone!

Payton Heyman

First off, I must reintroduce myself.  My name is Payton Heyman and I'm returning for my third internship under Danette Allen!  I just finished my freshman year at the Savannah College of Art and Design, where I am majoring in Social Strategy and Management with a minor in Film Production.  I am more than happy to return to NASA to continue creating video and blog content, as well as running the social media platforms: Twitter, Instagram, and Facebook.

Javier Puig-Navarro

Javier Puig-Navarro has been with the Ai since 2014. While working towards his Ph.D. at the University of Illinois at Urbana-Champaign, he has been a very important part of our team.  He is continuing his research in time-critical coordination and path planning for multiple unmanned vehicles.

Jeremy Castagno

After spending a few weeks with us last summer, Jeremy Castagno returned to the Ai about three weeks ago to continue his work!  He received his Bachelor's degree in Chemical Engineering from Brigham Young University and is currently a Ph.D. candidate at the University of Michigan, where he has completed three years and attained his Master's in Robotics.  He will be continuing his work on emergency landings for Unmanned Aerial Systems (UAS), aiming for autonomous machines to be more humanlike in making decisions, especially when it comes to landing and risk assessment. He is also working a lot with multi-modal data fusion, with modalities including depth information and RGB camera imagery.


Ben Hargis

Ben Hargis also made his triumphant return to the Ai in mid-May.  He is a Ph.D. student at Tennessee Technological University, studying mechanical engineering with a focus in robotics.  This summer he is working on a feasibility study for inertial transport, in which the team will create a multi-agent system that can successfully transport objects in a zero-gravity environment.

Chase Noren

Chase Noren returned for another internship back in January, and he is staying through July.  In August, he will be starting his Ph.D. program at the Carnegie Mellon University Robotics Institute.  During his time here, he is assisting with a metrology study for future In-Space Assembly missions.  The main goal is to better inform our understanding of how autonomous systems will be utilized in space.

Jamie O'Brien

Jamie O'Brien arrived in mid-May and is a new face at the Ai.  She is a rising junior at Olin College of Engineering in Needham, Massachusetts, where she is studying electrical and computer engineering.  This summer she will be working with Chase and Ben on the inertial transport project, and the three of them are preparing a demonstration to exhibit some of the results of the study.

Katie Clark

Katie Clark is a rising senior at the Massachusetts Institute of Technology in Cambridge, Massachusetts.  She will be spending her first summer at the Ai working with Lisa Le Vie, studying interactions between humans and robots. Through this study, she will be categorizing people's reactions to the different transparency levels, as well as quantifying minimum trust levels for different tasks.

Sarah Woodham

Sarah Woodham is a rising senior at Virginia Tech, where she is majoring in Mathematics with a minor in Computer Science.  This is her first internship with us, and she will be working very closely with Natalia Alexandrov and Rob Moreland at NASA Headquarters in DC. Throughout her internship, she will be "cleaning and mashing data sets from the NASA Human Factors Analysis and Classification System (HFACS) and the Safety Culture Survey," which will then "be used by contractors to visualize and analyze the data using Power BI."

All of the interns are excited to take on the summer with new projects and goals in mind! Make sure to stay up to date!