Friday, July 31, 2015

2015-07-31: Autonomy Incubator Helps Engineer Extraterrestrial Mouse Habitat

Jeremy Lim introduces the freshly crafted MICEHAB model.

One of NASA Langley's most intriguing summer projects will finally get a demo next week during Autonomy Incubator intern Jeremy Lim's exit presentation. Jeremy, a rising junior at Penn State University, has spent his summer working with the Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavioral Health (MICEHAB) team, with the goal of integrating autonomy into the robotically monitored mouse habitat.

"The idea is to launch a colony of mice into orbit and observe how they adapt and change with low gravity," Jeremy said. The MICEHAB project is still in its nascent stages, but the plan as it stands right now is to have the MICEHAB capsule in lunar orbit while spinning around a satellite, to which it will be tethered. The combination of the Moon's gravity and the centrifugal force generated from the spinning would mean that the mice inside the capsule would experience about one third of the Earth's gravity.

How the mice fare in space will provide important data for how human bodies would change and react during long space missions, like the journey to Mars.

"Will the younger mice have lower bone density? What about the mice born in low gravity?" he said. However, gravity isn't the only factor that could impact health in space: "The capsule will be radiation shielded, but we still won't have the benefit of Earth's atmosphere," he said.

Jeremy's portion of the project had him working with the hardware aboard MICEHAB, including the robotic arm that will feed the mice and clean their cages in orbit. Using the AI's AEON framework, he wrote a high-level control program designed to let MICEHAB function without human intervention for months, even years, at a time. When little mouse lives are at stake, the autonomous mechanisms keeping them alive need to work well and work every time.

"For long periods of time, it'll have to operate completely independently,"he said. "[My program tells the robot arm] 'Okay, do this task after that task,' or 'do that task because it's this time.'"

Jeremy slots a mouse cage into its shelf.

The robot arm can pull out a cage and lift its lid for cleaning and feeding.

A little demo mouse with a little demo house.

Although his time working on MICEHAB is quickly drawing to an end, Jeremy remains excited about the project's potential. He figures that if a team of students can put this much of it together in just ten weeks, then a team of NASA engineers could certainly bring a MICEHAB launch within reach.

"We're out here to say, 'This is plausible and and we can do it.'" he said.

For his part, however, Jeremy is pleased with how the summer's efforts have coalesced.

"It is kind of gratifying to see  everything almost together," he said. "We're in the home stretch."





Thursday, July 30, 2015

2015-07-30: Autonomy Incubator Aero Scholar Exit Presentations

Lauren Howell prepares to launch her demo.

Today's the day! Aero Scholars Lauren Howell (University of Alabama) and Michael Esswein (University of Buffalo) revealed their much-anticipated exit presentations to mark the end of their internships with the Autonomy Incubator (AI). In addition to their PowerPoint presentations, Lauren demonstrated her work with Bezier splines by including a flight demo launched with Meghan Chandarana's gesture recognition program, while Michael displayed a selection of the equipment he designed and 3D printed for various vehicles in the AI during his time here.

Michael explains the Odroid mount he created for the Mars Flyer.

Both Lauren and Michael also emphasized how much they had learned through working as part of the AI team. Lauren came into her project with no knowledge of splines or object-oriented programming, and learned them as she went to create her spectacular end result.

"It's amazing how you went from 'What's a spline?' to 'Let me tell you about splines,'" AI head Danette Allen commented to Lauren after her presentation.

Michael, meanwhile, spoke to how much he had learned about Kalman filters in all their various permutations.  To illustrate his enthusiasm, he included a picture of Kal-Man, the mathematical superhero that's been gracing the intern whiteboard since he and Josh Eddy started the project.

"[Before] I'd only heard of unscented Kalman filters, and now..." he said, pausing to roll his eyes as the crowd laughed. "Now I know things!"

Lauren and Michael's final day of work is tomorrow. We're too sad to post an entire article about their imminent departure, so for now, we'll say: Lauren and Michael, you were both incredible gifts to the Autonomy Incubator this summer. Michael, your infinite patience and gentle sense of humor made the intern cave a harmonious place to work (and occasionally goof off), and the diversity of your projects this summer shows how broad your range of talents is.  Lauren, your intellectual tenacity is just one facet of the strong character that makes you someone we are all proud to have on our team, but it also belies a warm heart and a desire to help others that permeates everything you do.  All of us in the AI can't wait to see how brightly both of you will continue to shine.

2015-09-02: Autonomy Incubator Student Exit Presentation - Lauren Howell

On July 30th, the Autonomy Incubator Summer 2015 Aero Scholars delivered their exit presentations documenting and demoing their research results. Lauren Howell, an undergraduate at The University of Alabama, reported on her work in modeling and simulation.

 
Lauren Howell presents "Spline Algorithm Development"

2015-07-29: Autonomy Incubator Puts On A Show for Youth Day

A young crowd looks on in anticipation as one of the kids pilots a UAV with Meghan Chandarana's gesture sensor.

Save for the hum of UAVs and the occasional shout across the flight range, the Autonomy Incubator (AI) is usually a quiet workspace, conducive to deep thought and uninterrupted coding. This was not the case today, as three waves of elementary school-aged children flooded into the AI for demonstrations, stickers, and the chance to fly a drone with gesture control. NASA Langley's annual Youth Day brings children from the Hampton Roads area on-center for a day of tours, and this year, the AI was on the schedule.

The program was short and fast-paced, designed to keep the attention of a room full of children while being easy to reset between the three tour groups. When each bus of visitors arrived, AI Head Danette Allen ushered everyone into the main room and gave a quick safety briefing.

Danette goes over safety precautions and fire exits.

Then, all the children crowded at the front of the net to watch interns Josh Eddy and Gil Montague do a kid-level version of #DancesWithDrones. Josh, playing the "crotchety neighbor" who ordered the UAV delivery, did an old man voice that never failed to get laughs from the kids, and Gil handed out the contents of the delivery package—NASA stickers!—to the enchanted crowd afterward.

Gil explains obstacle avoidance to the largest crowd of the day.

Then, just as our smaller guests were already brimming with excitement, Danette announced the second half of the demo: each of them was going to get the chance to fly a UAV exactly like the one they'd just seen.  Meghan Chandarana, the AI's resident gesture control expert this summer, demonstrated the gestures for take off, front, back, left, right, and land. Then, the kids took turns (by the third demo we learned to make them form a line) trying to navigate the UAV through an obstacle course we created out of Dr. Loc Tran's tree-dodging forest. There was a lot of barreling into trees and cheers of delight, plus some of them were really, really good at navigating the course. Better than the engineers, even.

"You can tell which are the ones that play video games," Meghan remarked.

Our tiniest pilot of the day.

Another UAV pilot in training.

Not to brag, but there's no other way to say it: we crushed this demo. The kids, faced with flying robots that can think for themselves and that they can magically control with nothing but their hands, were completely enthralled. One pair of brothers even tried to get right back in line after trying Meghan's demo because they were so excited to be UAV pilots. If the goal was to convince the youth of today to become the NASA scientists and engineers of tomorrow, then we can comfortably say that we did more than our part for the effort.

Tuesday, July 28, 2015

2015-07-28: Autonomy Incubator's Meghan Chandarana Makes Everyone A UAV Pilot

Meghan, right, demonstrates a gesture for Lauren.

In her six short weeks at the Autonomy Incubator (AI) so far, intern and Carnegie Mellon University PhD candidate Meghan Chandarana has opened a new realm of possibilities in the AI's control research through her work with gesture-based controls. Using her program and an infrared hand sensor, any user can use a dictionary of twelve simple gestures to create a trajectory for a UAV to fly, confirm it, then tell the vehicle to take off. In short, generating paths for autonomous UAVs is currently a task for specialized engineers in robotics labs (like us), but in the very near future, anyone with fine motor skills could be doing this in their backyard with just a laptop and an off-the-shelf sensor.

This week, her work faces its largest challenge yet as her program integrates with Lauren Howell's spline generator and Bilal and Javier's controls system for Lauren's exit demo. The gestures from the user provide waypoints for Lauren's program, which generates a trajectory spline and feeds it to Javier and Bilal's path-following program, which communicates the path to the UAV. It's an unprecedented combination of this summer's research; in addition, this demo marks the first time anyone other than Meghan has used her gesture recognition program in the flight range. After practicing with Meghan, Lauren will be the one to set the path and launch the vehicle on Thursday.
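To make the hand-off between those three programs a bit more concrete, here's a highly simplified sketch of how such a pipeline could be chained together. The function names and gesture vocabulary are hypothetical stand-ins, not the interns' actual code, and the "spline" stage is reduced to straight-line interpolation just to show the data flow:

```python
# Conceptual pipeline: gestures -> waypoints -> densified path -> vehicle commands.
# All three stages are hypothetical stand-ins for the interns' actual programs.

def gestures_to_waypoints(gestures):
    """Map each recognized gesture to a cumulative waypoint (x, y, z in meters)."""
    step = {"front": (1, 0, 0), "back": (-1, 0, 0), "left": (0, 1, 0),
            "right": (0, -1, 0), "up": (0, 0, 1), "down": (0, 0, -1)}
    x = y = z = 0.0
    waypoints = []
    for g in gestures:
        dx, dy, dz = step[g]
        x, y, z = x + dx, y + dy, z + dz
        waypoints.append((x, y, z))
    return waypoints

def waypoints_to_path(waypoints, points_per_segment=20):
    """Densify the waypoint list by linear interpolation (a real spline generator
    would produce a smooth Bezier/spline trajectory instead)."""
    path = []
    for (x0, y0, z0), (x1, y1, z1) in zip(waypoints, waypoints[1:]):
        for i in range(points_per_segment):
            t = i / points_per_segment
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), z0 + t * (z1 - z0)))
    path.append(waypoints[-1])
    return path

def follow_path(path):
    """Stand-in for the path-following controller that commands the UAV."""
    for point in path:
        pass  # send a position setpoint to the vehicle here

waypoints = gestures_to_waypoints(["front", "front", "left", "up"])
follow_path(waypoints_to_path(waypoints))
```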

"The system is not trained on any one user. The fact that Lauren can come in and teach a trajectory and have the system understand it is awesome," Meghan said.

Meghan and Lauren beam proudly during the maiden voyage of their demo.

While her existing program is already compelling, Meghan has even more sophisticated things planned for the future. Her next step, she says, will be to allow the user more control over the trajectory they construct.

"Right now, it's a pre-set length for every path. Eventually, you would first do a gesture, and then the system would say, 'Hey, you did a spiral, how long do you want that to be?' Then you'd use another gesture to specify the length or the radius of a circle," she said.

Beyond the immediate, Meghan's ultimate goal for the project is to incorporate neural nets and integrate machine learning into her program, then train it on a variety of users.  The more versions of the "down" gesture it sees, for example, the more nuanced its understanding of "down" will become, and the more robust its functionality will be. 

"We want multiple people to come in and teach the system, 'What is a spiral?'" she said.

Meghan's excitement and dedication to her project is contagious; the entire AI comes out to see her tests whenever she flies. Other interns have even started writing programs for the sensor she uses; intern Gil Montague wrote a controller that uses the angle of the user's outstretched hand to maneuver a UAV around. It's so easy to use that we're setting it up for the kids at NASA Langley's Youth Day to play with tomorrow.

Meghan demonstrates another kid-friendly aspect of gesture control, which is that it frees up your other hand to hold a freezie pop. Nick Woodward, enraptured, looks on.







Monday, July 27, 2015

2015-07-27: Autonomy Incubator Intern Duo Tackles Kalman Filtering


Josh Eddy and Mike Esswein: the few and the proud
 This week marks the penultimate week for most Autonomy Incubator (AI) interns, and the last five days of work for Michael Esswein and Lauren Howell, the Autonomy Incubator's two Aero Scholars. On Thursday, both will present their research before parting ways to opposite ends of the country—Lauren to Alabama, Michael to the University of Buffalo.  We covered Lauren's incredible achievements for the summer last week, but what about Michael? Where has he been for the past nine weeks?

Michael ("Mike" to his friends, which are everyone) has been working on a team with fellow intern and soon-to-be Master's student Josh Eddy, investigating sensor fusion using a genre of algorithms called Kalman filters.  Remember those from Loc's work? They filter out noise from sensor data, and can also fuse data from different sensors (hence, sensor fusion) into one estimate of where the vehicle is in space. Loc, as we remember, is researching applications for extended Kalman filters (EKF) in sensor fusion for autonomous machines, while PI Jim Neilan is doing similar research with a more recent iteration called an unscented Kalman filter (UKF). For those curious, the UKF takes its name for a mathematical concept it uses called the "unscented transform," which in turn was named after a stick of unscented deodorant that its creator spotted on his labmate's desk. Science is amazing.

Josh and Mike's mission this summer has been to support Loc and Jim's research; they've spent the past two months reading jargon-riddled papers and teaching themselves out of textbooks from AI Head Danette Allen's office library—which, conveniently, is full of Kalman filtering books that were carefully read and annotated for her PhD thesis. It's been a Herculean endeavor that's kept both of them in the AI's intern cave (the colloquial term for the warm, dimly-lit room where the interns collaborate) for long hours each week. To keep morale up, they even sectioned off part of their whiteboard for the public to leave their Kalman-related jokes:

Josh explains Kalman filters as King Tuten-Kalman looks on.
This is the part of the blog where we explain how Kalman filters work, which is going to be difficult because there's math involved and the person who writes our blogs is an English major at a liberal arts college. Luckily for all of us, Josh and Mike are here to help everyone understand Kalman filters.

"First, read this," Josh said, flopping a massive C++ textbook onto the table.

"Then this," Mike said, holding up a Kalman filter book from the pile on his desk.

"Then go to grad school," Josh concurred. Finally, with a little more plying, they agreed to at least attempt to explain Kalman filtering to the American public.

Mike's book of the month.

"What Kalman filtering is, is a form of state estimation. We're answering a question, and that question is, 'What is the state of our system right now?'" Josh said. "It's an extremely powerful method of estimating the position of a vehicle." Basically, it provides a snapshot of what the vehicle is doing at a point in time, compares it to previous snapshots, and calculates where the vehicle is in space based on the differences.

Mike, meanwhile, emphasized the ability of the Kalman filter to seemingly create order in a chaotic universe of data.

"One of the things you have is random noise or drift, and one of the cool things about that is even though it's random, it always forms a bell curve," Mike said. That's because the filter uses normally distributed (and white) noise to account for uncertainty in its estimates.

Interestingly, both of them have dealt with Kalman filtering before: Josh at his internship at the National Institute of Aerospace last year, and Mike through his work on the University of Buffalo's joint satellite project with the Air Force and NASA.

"Once it's on your resume, you become the Kalman filter guy wherever you go," Mike said, to a knowing laugh from Josh.

Friday, July 24, 2015

2015-07-24: Autonomy Incubator Demos for Aerospace Safety Advisory Board

AI Head Dr. Danette Allen and John Foggia greet the panel.

The Autonomy Incubator (AI) was honored to be asked to present for the NASA Aerospace Safety Advisory Panel (ASAP) yesterday, the latest demonstration in a summer's worth of high-profile guests to Building 1222.  In accordance with ASAP's mission, the AI decided to forgo its usual demo in favor of a more concentrated program that focused on the safety stops and measures engineered into everything we do.

When the ASAP contingent arrived, they were greeted at the door by AI Head Dr. Danette Allen and AI member John Foggia, our resident flight safety official.  John gave them a thorough safety briefing once everyone gathered in the main room—stay back from the net while UAVs are in the air, wear safety glasses if you enter the flight range, fire exits are behind you if anything bursts into flames—before Danette took over to explain the AI's mission and research.

Danette fields questions about the AI's stable of UAVs.

Once our guests were acquainted with the Autonomy Incubator and the amazing flying robots therein, John took over again to introduce the day's exercises. We began with an updated version of #DancesWithDrones led by interns Josh Eddy and Gil Montague. Although it's a familiar joke around here, the title elicited a hearty chuckle from the crowd.

"It's like baseball, keep your eye on the ball. We don't want anyone catching a foul ball today." -John, on UAV demo safety

Gil began the AI's signature demo as usual, explaining how the UAV senses his presence and avoids him as he sidestepped in and out of the vehicle's path. Then, Josh donned his own fiducial-dotted helmet and joined Gil in the GPS-emulation area, where they both jumped in and out of the UAV's planned flight path as it replanned around them in real time.  Below, you can see the live video of the obstacle avoidance program calculating new routes every time Gil or Josh (the two red circles) moved.


The AI's two-man theater troupe takes the stage once again.

Then, intern Javier Puig Navarro stepped up to the net to introduce his and Bilal Mehdi's multiagent coordinated flight demo. In this scenario, he explained, a team of scientists has programmed four UAVs to spiral around and collect atmospheric samples, but several unexpected things happen to excite the vehicles' safety stops.

Javier elaborates that whenever the UAVs encounter an unsafe condition, they land.

Taking the proverbial baton from Javier, intern Meghan Chandarana explained how her gesture recognition program had been integrated into the vehicles' path-following program to allow anyone to control multi-vehicle teams of UAVs. With an upward flick of her hand, the quadrotors rose into the air.

Meghan readies for takeoff.

The UAVs started ascending in sync and began their atmospheric sample-collecting spiral path, but—what's this? A NASA employee didn't know he'd been replaced as a sample collector! He's right in the middle of the vehicles' spiraling path!

He's so confused, he's even trying to take air samples with a multimeter.

The UAVs stopped and hovered in place until Bilal shooed Gil out of the arena, at which point they resumed flying. Disgruntled at his sudden lack of employment, Gil snatched one of the drones out of the air and sprinted with it out of the GPS-emulation area. The remaining three drones landed upon losing contact with their team member, which had exceeded the geocontainment boundary, much to the awe of the ASAP spectators.

Gil absconds with a UAV.
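For readers curious what a "safety stop" looks like under the hood, the decision logic boils down to something like the following sketch: a check every vehicle runs on every control cycle. This is a simplified illustration with made-up geofence bounds, not the actual flight software:

```python
# Illustrative safety monitor, not the AI's actual flight code.
GEOFENCE = {"x": (-5.0, 5.0), "y": (-5.0, 5.0), "z": (0.0, 4.0)}  # meters, hypothetical bounds

def inside_geofence(pos):
    """pos is a dict like {"x": ..., "y": ..., "z": ...}."""
    return all(GEOFENCE[axis][0] <= pos[axis] <= GEOFENCE[axis][1] for axis in ("x", "y", "z"))

def safety_action(own_pos, intruder_in_path, teammates_ok):
    """Decide what a vehicle should do this control cycle."""
    if not inside_geofence(own_pos):
        return "land"        # exceeded the geocontainment boundary
    if not teammates_ok:
        return "land"        # lost contact with a team member
    if intruder_in_path:
        return "hover"       # hold position until the path is clear
    return "continue"

print(safety_action({"x": 0.0, "y": 1.0, "z": 2.0}, intruder_in_path=True, teammates_ok=True))
# -> "hover"
```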

This demo was a fun challenge to meet, because we got the chance to showcase all of the hard work that we put into safety here. It was a chance to take the things that usually never see daylight and put them in a demonstration, just to drive home how safely autonomous machines and humans can interact. We're excited that ASAP is interested in UAV safety, because we are too.








Thursday, July 23, 2015

2015-07-23: Autonomy Incubator Intern Hits Bezier Curve Breakthrough

Lauren hits "take off" on her first AI test flight.

After nine weeks' worth of rigorous code reviews and teaching herself advanced mathematical concepts from academic papers, Autonomy Incubator (AI) intern Lauren Howell saw her work generating splines and Bezier curves for path-following algorithms take tangible form in the flight range yesterday. Lauren, an Aero Scholar from the University of Alabama's aeronautical engineering program, has shown an incredible dedication to her project throughout the summer—she learned both C++ and much of the high-level math required for her project as she went, and now has a robust program and an encyclopedic knowledge of splines to show for it.

Specifically, Lauren's program generates the control points that make UAVs using waypoint path-following navigation fly in smoothly curving, accurate paths. Joining waypoints by flying in straight lines causes overshooting problems, she said, as most vehicles can't make sharp turns in the air. Think of it like a racecar skidding sideways out of control after taking a turn too hard; it's neither efficient nor safe. Our UAVs need to find and hold their Formula One line through each change in direction, and that's exactly what Lauren's software empowers them to do.

"You want to be able to connect [the waypoints] in a way that adheres to the dynamics of the vehicle," she explained.  By generating control points in gentle paths to connect the waypoints, the UAV can hit each waypoint while avoiding overshooting.

To generate her control points, Lauren used Bezier curves over each waypoint, joined by splines to create a "smooth, continuous trajectory."  Just look at how silky smooth the path that she generated for yesterday's test flight is:

The UAV takes off, flies a square path, then spirals up.

"The test flight was to show that my code will take in a given set of waypoints that you want the vehicle to fly through, and will use the algorithm in my C++ program to generate the control points that are implemented into the Bezier curve equation such that we get the desired flight path," she further explained.

So, how did her first flight go? Marvelously; just take a look at the video below. Notice how the entire intern population of the AI seems to be gathered around to partake in the glory.



After her initial success, Lauren analyzed the results from the flight and ran two more flight tests under different conditions to gather more data.

The usual suspects gather around: Meghan, Bilal, Lauren, and Javier.

Lauren, pensive, looks over her results.

The second flight test in action.

For her exit presentation next week, Lauren plans to make her demo a showstopper by including other work from around the AI.

"We want to integrate Meghan's gesture recognition software," she said. Lauren plans to use fellow intern Meghan Chandarana's program, which allows users to interact with UAVs through gestures that mean things such as "take off" and "move forward," to feed waypoint data directly into her curve generation program and create a flight path. Although no one in the AI is ready to say goodbye to Lauren, we have to admit that we can't wait to see what kind of engineering magic her demo will bring.






Wednesday, July 22, 2015

2015-07-22: Autonomy Incubator Seminar Series: Dr. Max Versace and the Neurala Team

Dr. Versace stands in front of Neurala's robot to demonstrate its obstacle avoidance capabilities.

A hockey puck-shaped robot, carrying a laptop and a Kinect™ on a set of shelves, was the star of the Autonomy Incubator (AI) today as it trundled around a makeshift enclosure in the AI flight range. CEO Dr. Massimiliano "Max" Versace, Dr. Matt Luciw, and doctoral candidate Jeremy Wurbs represented robotics company Neurala, a NASA Phase II SBIR recipient, on their visit to NASA Langley Research Center. They came not only to deliver a lecture, but also to demonstrate their navigational and collision avoidance research, as well as the soon-to-be-released Apple iOS app that has grown out of that research.

Neurala, as its name would suggest, uses concepts from neurobiology in its machine learning research. Specifically, they model their object recognition and their mapping software, the system's two major components, on the way the mammalian brain processes visual information.

"One main inspiration on the high level is the 'where' and the 'what' pathways in the brain," said Dr. Luciw during his talk. By using software to mimic the way human brains can recognize individual objects and then place those objects in a mental 3D map of their surroundings, Neurala intends to create robots that can position themselves in maps of their own creation, based only on monocular video and IMU (Inertial Measurement Unit) data - what Dr. Versace calls "passive measurements." Remember Loc's research on autonomous navigation in GPS-denied environments? Neurala's robot works somewhat like that, but with more of an emphasis on object recognition and identification, or classification.

Inside the enclosure, the robot detects and identifies a dog. The walls are covered in pictures to give the algorithm plenty of features to latch onto. (Yes, that's a photo of Mark Motter.)

Dr. Luciw pulls up several displays to show what data the robot is taking in.

"What we're building is a way for everybody to make a robot operate hands-free," said Dr. Versace. "The goal is for users to tell robots what to do, but not how to do it."

Neurala started in 2006 as a project in a Boston University business class, and in the nine years since, it has gained massive momentum in the deep learning community. The AI's Mark Motter has worked with Dr. Versace and his team since 2011, when their first NASA award brought them on board to do UAV collision avoidance research.  Now, the applications Neurala wants for its research range from toy robots controlled by their app, to industrial robots made safer through collision avoidance, to telepresence robots able to gracefully navigate through their surroundings.

"Another application we are working on is inspection," Dr. Versace said. Small UAVs can scan pipes or structures for defects, holes, and rust, then report back with its findings. "It can come back and say, hey, out of this five hours of video, I found three things I think you should look at."

Now, about this iOS app we keep mentioning: it's called Roboscope and it can pair your device with your ground or air toy robot to let you choose a "target" from the scene around you on your screen—say, your friend's backpack—then have the robot autonomously follow him, do a "sneak attack" where it swoops by/jumps out at him, and more. You can even "teach" the app to recognize faces and objects and then enter them into its memory, so that whenever the device camera sees that person/thing, its name will pop up on the display.

Dr. Versace teaches Roboscope to recognize Mark Motter.

However, don't be fooled by all the fun Neurala is having: they're doing some serious autonomy research with fixed-wing UAVs as well. As Jeremy Wurbs elaborated during his part of the seminar today, his research on collision avoidance aims to meet the FAA's mandate that a UAS be able to fly "at least as well as a human pilot." Human vision, he explained, uses both peripheral vision and foveal (focused) vision systems to take in visual data, and he's implemented a similar approach in his collision detection algorithm.  The peripheral scans for any approaching aircraft, then the foveal vision locks in on the traffic and uses optic flow to determine if the UAS is on a collision course with it.

"We're looking for radially expanding flow fields," he said.  Essentially, if something is expanding in the UAS's field of view, then the algorithm knows that it's approaching.  If that something's radius is within the area of the UAS, then the algorithm knows it's on a collision course and acts evasively.

Jeremy walks the crowd through some example simulations.

AI interns Meghan, Josh, Nick and Nick are feeling this lecture.

Want to see Jeremy Wurbs' algorithm in action? Neurala conducted real, live test flights with fixed-wing UAS's in restricted airspace, and they were kind enough to upload footage from one of the cameras. Look at how quickly the algorithm picks up on the other vehicle when it enters from the left.




Tuesday, July 21, 2015

2015-07-21: Autonomy Incubator Flies Coordinated Micro UAVs

AI blog regulars Bilal, Gil, and Javier celebrate with their micro UAVs.

Just two weeks after the micro UAVs landed in the Autonomy Incubator (AI), the controls research of interns Javier Puig Navarro and Bilal Mehdi has merged with intern Gil Montague's communications research to make path-following and coordinated flight with these tiny vehicles a reality.  Three micro UAVs took off, followed a circular ascending path, and landed in unison.

Trouble finding the little guys? Look at the trees.

The breakthrough came at the very end of the day, after hours of fiddling with fiducials (the reflective silver alloy dots that make the vehicles trackable in our indoor flight area) and carefully tweaking variables finally paid off. Before the AI attempts any multivehicle flight, each UAV must prove its individual robustness, a time-consuming process on occasion (such as today) but one imperative to both safety and accurate research.

While the path they followed was simple, today's flight sets an important precedent for future AI research with micro UAVs.

"We got these vehicles going and now we can test them," Gil said.

"We've done three, now we can do four as well, which won't be that big of a leap," Bilal added.

With multiple micro UAV flight proven successful in the AI flight range, the team is one step closer to simulating flocking behaviors and other awesome demonstrations. Watch the magic happen below:








2015-07-20: Autonomy Incubator Assists in First FAA-Approved UAV Delivery

The SR22. Image credit NASA.

Reaching remote locales in need of supplies is a common goal across the unmanned and autonomy research communities. It was with this challenge in mind that the Let's Fly Wisely program came into being: using UAVs to deliver prescription medication to the Remote Area Clinic in Wise County, Virginia.

Sure, the massive media attention surrounding this first FAA-approved unmanned package delivery in the National Airspace System (NAS) can only mean good things for the Autonomy Incubator (AI) and its mission of autonomous flight, but we have an even closer connection: AI member John Foggia, the project manager for the CERTAIN (City Environment for Range Testing of Autonomous Integrated Navigation) program and a driving force behind our recent untethered flight, served as the Safety Officer for Fly Wisely. Even more monumentally, he secured a role for NASA Langley's SR22 UAS surrogate as the fixed-wing aircraft that flew the medication into the Wise, VA airport to be transferred onto a hexrotor UAV.

That's a lot of information, so let's back up a bit: the NASA Langley SR22 is a one-of-a-kind airplane, built on-center by a team of which John was a part.  It's what he calls a "surrogate UAS," a vehicle that was not built to be autonomous but was retrofitted that way. Everything—pitch, roll, yaw, attitude—is controllable from the ground. They even built a robotic arm to work the throttle and give remote speed control, and it's accurate to within one knot.

"Better than a human can do it. Well, other than me," John joked.

What makes the SR22 even more unique is that it's not technically unmanned: a "safety pilot" sits in the cockpit and lets the airplane have the controls, ready to hit the manual override button and take over if need be (which rarely happens). In addition to the pilot, there's also another person in the back of the plane with a radio headset and a computer. In the event that the ground station loses contact with the airplane, the onboard researcher can take over controls with his or her computer and continue flying the mission.  Because it's a manned aircraft, the SR22 can fly in the NAS, which was crucial for Fly Wisely.

The Let's Fly Wisely project is the brainchild of several organizations, including Remote Area Medical (RAM), the Mid-Atlantic Aviation Partnership (MAAP), the charity Health Wagon, Virginia Tech, and Australian UAV delivery company Flirtey.  John, representing NASA LaRC, was originally brought on board in April to be the safety officer.  However, when the fixed-wing craft that was supposed to fly the mission suddenly became unavailable, John saw an opportunity to show what NASA Langley can do.

The SR22 had flown LD-CAP, another big research mission off-center in North Dakota some years ago, John said, so he had complete faith that the SR22 would be both good for Fly Wisely and good for NASA. "[The North Dakota flight] was a huge success, but very quiet to the media. This [Fly Wisely mission] wasn't quiet."

The efforts to which John and the rest of NASA Langley went in order to make the SR22 a part of Fly Wisely are simply extraordinary, given the incredibly tight turnaround for all the paperwork that had to go into such a high-profile and groundbreaking endeavor.  Space Act Agreements, COAs, every conceivable bureaucratic hurdle. But, they did it, and as John said, "our part went off flawlessly."

A crowd of hundreds met the SR22 as it cruised into the Lonesome Pine Airport, watching it circle overhead before it landed to deliver its package of medication.  As the spectators swarmed the airplane, John controlled the crowd and took the package outside, where a doctor took control of it and hooked it onto the Flirtey hexrotor's delivery apparatus—a specially-designed cable winch under the vehicle. Finally, the UAV took to the skies.



The hexrotor used GPS waypoints to navigate the 20-mile journey to the Wise County Fairgrounds, where the one-day RAM Remote Area Clinic was underway.  According to John, there were people camped out the night before just to make sure that they received care at the first-come, first-served clinic.

"Once they're out of stuff, they're out of stuff," he said. That's where autonomous UAV supply delivery comes in. In a future where UAV delivery becomes ubiquitous, clinics like RAM's will be able to serve more people. No longer will medical staff be limited to only what they can carry on site when autonomous UAVs can deliver more of whatever they need.

That's the future John sees for flight: autonomous air vehicles taking over for human pilots, both in airplanes and in small UAVs, and NASA Langley remaining crucial to making that change possible.

"We're trying to promote trust in autonomy, so we're doing in situ research that people will believe. ...  We need to convince the public that autonomy is the safest way forward," he said. As autonomy research continues, "Langley has a set of competencies, capabilities, and access that no one else has."










Monday, July 20, 2015

2015-07-18: Autonomy Incubator Summer Picnic



The summer has flown by at an unbelievable pace at the Autonomy Incubator (AI). The next weeks will see the departure of most of the AI interns as their ten-week tenures come to a close and they scatter across the country again.  To celebrate their contributions and send them back into the world with a fanfare, AI Head Danette Allen and AI secretary Carol Castle organized a potluck picnic in the park next to the office.  Gracious host Danette provided massive quantities of fried chicken and a craft beer station for those old enough to enjoy, while everyone else brought a dessert or side dish.

The AI power table tucks in.

NIA interns Nicole and Curt (right) joined us as well.

Danette and Carol bask in their party-planning glory.

More guests! This was the hottest party of the summer!

Jim Neilan brought his son, James, who is objectively the cutest baby in the universe.

After everyone finished eating, the interns (and some intrepid PIs) took to the park and the accompanying playground to frolic. Baby James made an appearance on the playground too, as the only member of our party for whom any of the equipment was actually designed. That didn't stop most of us from squeezing our ungainly adult bodies into the slides and onto the rocking horses, though!

Bilal and Loc have a jumping contest.

Loc and Gil... 
Ben Kelley...

Javier... 

Abbey and Meghan...

... and Baby James try out the bouncy space shuttles.

 One of these things is not like the others.

"I'm only a little stuck." -Loc

"I can't remember the last time I was upside down!" -Bilal

Meghan surveys the Center from above.

While half of the crew gamboled about like kindergartners on the playground, the other interns played lacrosse with the toy sticks and balls that Carol provided.

Josh Eddy, Nick Woodward, and a MARTI guest play lacrosse.

High school volunteer Nick Selig goes for a catch.

Nick (the other Nick) loosens the pocket on his stick.

Mike, Jeremy, Nick and Nick take a break from playing.

We can't believe that it's already almost time to say goodbye; the Autonomy Incubator won't feel the same without this stellar crop of interns around. The coming weeks will be bittersweet, to be sure, but we'll enjoy all of our time together as we spend our last days working and playing as a team.