Thursday, July 16, 2015

2015-07-16: Autonomy Incubator Interns Work Together to Crack Coordinated Flight

Bilal, left, tests his controls for the micro UAV as Javier, right, holds its tethers.

Tethered flights like this are a good way of calibrating controls without having to
chase the UAV all over the flight range.
As anyone following the Autonomy Incubator Twitter account already knows, interns Javier Puig Navarro and Bilal Mehdi are constantly performing new, innovative tests in the AI indoor flight range as they make progress towards reliable coordinated flight. From drone hoedowns to UAV tag, both of them have become as famous for their entertaining flight tests as they have for their unshakeable work ethics. Isn't it shocking, then, that they had not yet been profiled on the official Autonomy Incubator blog? We thought so, so today, we're bringing you all the details about everyone's favorite foreign nationals.

The main thrust of Javier and Bilal's research in the AI is coordinated flight, or creating algorithms that let multiple UAVs fly together and communicate to achieve a common mission.

"We do three things: coordination, path following, and trajectory generation," said Javier. "Those are all the ingredients you need for coordinated flight."

"You need to plan what you need to do, then make sure the vehicles follow that plan," Bilal added.

So far, the demo they've used to illustrate the capabilities of coordinated flight is a four-drone mission, where all the vehicles take off, spiral up to a predetermined altitude, then switch formations so that one UAV hovers as the other three orbit around it. This demo, they explain, is a small-scale example of a real atmospheric science data-gathering mission that could be run in the future.  They first rolled it out in our presentation for Deputy Administrator Dava Newman, and have since modified it to include intern Meghan Chandarana's gesture recognition work, like in our presentation for Newport News Mayor McKinley Price.

Remember, this is all autonomous! Incredible, right?
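For the curious, the spiral-up leg of a demo like this boils down to generating a timed list of waypoints along a helix. Here's a minimal Python sketch of that idea (our own illustration with made-up parameter names, not the actual trajectory-generation code Javier and Bilal use):

```python
import math

def spiral_trajectory(radius, z_start, z_end, turns, steps):
    """Waypoints (x, y, z) along a helical climb.

    Hypothetical sketch: every name and value here is our own,
    chosen only to illustrate a spiral-up maneuver as a curve.
    """
    points = []
    for i in range(steps + 1):
        t = i / steps                        # progress along the path, 0 to 1
        angle = 2 * math.pi * turns * t      # total angular travel so far
        z = z_start + (z_end - z_start) * t  # climb linearly with progress
        points.append((radius * math.cos(angle),
                       radius * math.sin(angle),
                       z))
    return points

# A three-turn spiral from the floor up to 2 meters, sampled at 300 steps
path = spiral_trajectory(radius=1.0, z_start=0.0, z_end=2.0, turns=3, steps=300)
```

Each vehicle in the four-drone demo would follow its own offset copy of a path like this, with the coordination layer keeping the vehicles in sync along their curves.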

This year is Javier and Bilal's second internship doing coordinated flight research at NASA, but both agree that this summer has brought the most success.

"Last year we started [coordinated flight research], but we never got to fly multiple vehicles," Javier said.

"We had a lot of communication problems last year, but [the AI] has solved them," added Bilal.

One of the secrets to their success is how well they work together, a synergy imported from the lab they share at the University of Illinois at Urbana-Champaign (UIUC).  Bilal is in his third year of his Mechanical Engineering PhD, while Javier is in his second year of his Aerospace Engineering PhD. Both of them work in the multidisciplinary Advanced Controls Research Lab headed by mathematician Dr. Naira Hovakimyan.  Because of Dr. Hovakimyan's expertise in math, the students in her lab must provide mathematical proof that their algorithms work every time, in all conditions—a rigorous process which guarantees robustness.  According to Bilal, such a level of scrutiny is crucial for implementing autonomous UAVs in real-world situations, especially in his specialty of coordinated collision avoidance.

"If we can guarantee collision avoidance, then we can start having real applications, even high risk ones," he said.  He has a good point— if UAVs are going to be a part of search-and-rescue someday, then they need to perform well every single time they go out. The sooner these algorithms become robust, the sooner UAVs can start helping save human lives.  

Other applications Javier and Bilal anticipate for coordinated autonomous UAVs include scientific missions (like the atmospheric sample collecting in their demo) and Javier's specialty, air traffic control.  Even though both flight technology and air traffic volume have grown massively over the past decades, we still coordinate takeoff and landing for commercial jets the same way that we did in the Reagan administration: with human air traffic controllers giving verbal commands to human pilots. For his Master's thesis, Javier proposed using time coordination to automate the entire landing process for all airplanes coming into a certain airport, from queueing to the individual landings. 

As if being brilliant wasn't enough, both Bilal and Javier are just delightful people. Before he got into aerospace, Javier sailed competitively in his hometown of Valencia, Spain from when he was six years old until he left for the United States at twenty-three. If you ask, he'll drop whatever he's doing to tell you about the regattas he's raced in, or how the freshest fish he ever ate was the tuna he caught in the Mediterranean.  Meanwhile, Bilal has an undergraduate degree in mechatronic engineering—a sci-fi sounding combination of mechanical and electrical—and often swaps lunches with Carol, the Autonomy Incubator administrative assistant. He brings her chicken biriyani; she brings him cheese manicotti.


Wednesday, July 15, 2015

2015-07-15: Autonomy Incubator Welcomes New Team Member Joe Lemanski


Joe Lemanski sits at his command center in the backstage area of the Autonomy Incubator's flight range, surrounded by three computer displays running on two different machines. He takes a sip of his NASA Langley cafeteria coffee (which is surprisingly good stuff) and continues to work as our social media intern snaps his picture again and again.

"I'm researching different autopilots," he says of his work for today. "We need to figure out if an open-source solution would be optimal for us, or some other source."

Joe is the AI's most recent hire; his first day was this past Monday.  He's hardly a newcomer to the world of small UAVs, however.  He graduated from Virginia Tech with a bachelor's in Mechanical Engineering in 2009, and while he was there, he worked in the Unmanned Systems Lab under Dr. Kevin Kochersberger.  After graduation, he went on to work and do research for BOSH Global Services, a private company based in Newport News that provides unmanned aerial systems services to both the DoD and customers in other countries. After that job, he spent some time working in IT and infrastructure under an Air Force contract, a time he calls "a little off track" from his specialty but which he believes will prove helpful in the Autonomy Incubator.

"Now, I have a really good grasp of the implementation of things in the real world," he says.

Now that he's officially joined the Autonomy Incubator team, Joe is looking forward to exploring the myriad of different areas his new position lets him work in.

"Here, I'm going to be the hardware guy, but I'm going to wear two hats. I'll also be handling the implementation of some of the software on the hardware," he said. "Software-hardware integration, you could call it."

Joe will also be heading up an assessment of the AI software architecture and code base—coming up with a "configuration management solution," he says. While such massive undertakings can be daunting, Joe's outlook on the project remains confident and simple: "Let's make a place for stuff and then put the stuff there." A succinct motto, easy to rally behind.

While he works to keep the Autonomy Incubator a smooth-running research machine,  Joe is also working towards a Master's degree in Electrical Engineering from nearby Old Dominion University (ODU).  Formally, his concentration is in signal processing and communication, but he still has plenty of time to decide where he wants to focus his research.

"I want to do video analytics, and machine learning is interesting to me, but I'm not sure yet," he said.


Monday, July 13, 2015

2015-07-13: Autonomy Incubator Gains Two High School Volunteers

Nick, left, and Zach, right

The Autonomy Incubator (AI) and its already booming population of interns are proud to welcome two high school volunteers for the rest of the summer, rising juniors Zach Wusk of Tabb High School and Nick Selig of Norfolk Collegiate School.  Both Zach and Nick have already become a comfortable part of the routine of daily life here in Building 1222, and we look forward to the valuable help and ideas they'll bring to the AI over the next four weeks.

While they'll be working as a team, the two volunteers came to the AI through very different paths—according to Nick, he "got linked with [LaRC AI head] Dr. Allen" through a family friend who works across the street in Building 1268, while Zach ended up here through an even more personal connection: both his parents work at NASA Langley Research Center.  His dad is a pilot for the Airborne Science Data program, and his mom also works in the hangar with the Game-Changing Development group.

Although both express a desire to study engineering in college, both are still learning about different fields before deciding on a specialization.  Nick, for his part, enjoys the more computer-based side of the AI's research.

"I like how they do software—code it and program it and stuff. I think it's really cool," he said. He's also excited to see the UAVs do some "navigation and maneuvering" after hearing about Loc's tree-dodging machine learning research.

Zach, meanwhile, declares a greater interest in the real-world applications of what the AI's vehicles can do.

"I'm interested in drone flight and the different ways it can be used," he said. "I'd really like to see when they're all swarming together," he added, referencing Gil's research on coordinating flight in a flock micro UAVs.  He's seen some of Javier and Bilal's coordinated flight research, he said, and will be interested to see what comes next.

So far, Zach and Nick have tackled their first task together—assembling the golf net the AI ordered for portable demos—with alacrity.  Now, all they have to do is find a way out of it.







Friday, July 10, 2015

2015-07-10: Autonomy Incubator Hatches Flock of Micro UAVs



Robotics engineers at institutions all over the globe have begun using very small UAVs for autonomy research this year, from the Technical University of Munich to MIT to USC (the one in California, not the Gamecocks).  Called either "micro" or "nano" UAVs, these tiny machines can fit in an outstretched hand and can be used to support much of the same research as full-size UAVs.  One Master's student in Germany even used one to do research on one of our specialties, visual odometry.

So popular are micro UAVs in the research community, and so intriguing are their possibilities, that Autonomy Incubator (AI) intern Gil Montague has chosen them as the focus of a new collaboration with the coordinated flight trajectory research of interns Javier Navarro and Bilal Mehdi.

Part of what makes micro UAVs so interesting to work with is the challenges their minuscule surface area presents.

"How do things change when you don't have as much real estate?" Gil said of the main question driving micro UAV research. "While these are fun, they're being used as real research vehicles."

Despite their spartan cargo capacity, however, they also bring a few distinct advantages for coordinated flight research: because they're so small, hundreds of them could fly simultaneously in our indoor operational area. The possibilities are astonishing; imagine a demo with an entire fleet of miniature UAVs maneuvering together in a flock or simulating a fleet of different types of aircraft in our National Airspace System (NAS).

Gil's first goal after integrating the micro UAVs into the AI framework is to design a physical demonstration of a simulation called the Pursuit Domain, in which four "predator" entities try to autonomously find and surround the "prey."

"The hardware platform [of a UAV demonstration] is to show in the physical world the capabilities being developed in the simulated world," he said.  So far, the only demo of that kind is robot soccer.  Peter Stone, a professor at the University of Texas and Gil's "favorite person," gave a TedxYouth talk about how using robots to play soccer allowed him to explore machine learning and "ad hoc teamwork." (Here's his paper, if you'd rather read.)



Just like Dr. Stone used robot soccer to further machine learning research, Gil wants to use a live, physical version of the Pursuit Domain. There are a number of ways to run the Pursuit Domain—letting the predators communicate, making them work independently, adjusting how much the prey can move at one time—which makes it an excellent avenue for research as well as a compelling, entertaining demo.
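To make the idea concrete, here is a toy grid-world version of the Pursuit Domain in Python. It is our own sketch, not Gil's simulation: four predators each greedily chase one assigned side of a randomly walking prey, and a capture occurs when all four cells adjacent to the prey are occupied.

```python
import random

def step_toward(pos, target):
    """Move one grid cell toward target (x first, then y)."""
    x, y = pos
    tx, ty = target
    if x != tx:
        x += 1 if tx > x else -1
    elif y != ty:
        y += 1 if ty > y else -1
    return (x, y)

def surrounded(prey, predators):
    """Capture condition: all four cells adjacent to the prey are occupied."""
    px, py = prey
    return {(px + 1, py), (px - 1, py),
            (px, py + 1), (px, py - 1)} <= set(predators)

random.seed(0)
prey = (10, 10)
predators = [(0, 0), (20, 0), (0, 20), (20, 20)]
offsets = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # each predator claims one side

steps = 0
while not surrounded(prey, predators) and steps < 500:
    # Predators: greedy move toward their assigned side of the prey
    predators = [step_toward(p, (prey[0] + ox, prey[1] + oy))
                 for p, (ox, oy) in zip(predators, offsets)]
    # Prey: random walk (it may also stand still)
    dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)])
    prey = (prey[0] + dx, prey[1] + dy)
    steps += 1
```

Swapping out the movement rules (letting the predators communicate, or making them work independently) is exactly the kind of variation that makes the domain a rich testbed.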

"You can imagine a demo of these [micro UAVs] trying to corner a ... drone," Gil said.

The AI currently has a fleet of eleven micro UAVs, all Crazyflie 2.0™ drones made by Bitcraze.  In order to both shield the tiny props and provide a place to mount fiducials, intern Josh Eddy designed and 3D printed protective bumpers that can be snapped onto the vehicle body without the use of tools.


Because we have so many of them, Gil also plans to explore flocking behaviors with the micro UAVs.  Think of how starlings can fly in massive clouds together, instinctively staying close to each other while staying away from obstacles and predators. Now, imagine replacing the birds with tiny robots, and replicating the birds' natural behavior through autonomous, agile coordinated flight. Because the Crazyflies are so small, Gil says, we could conceivably implement a flocking demo inside Building 1222, in addition to the Pursuit Domain demo.
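The classic way to replicate that starling behavior in software is a boids-style update with three rules: steer toward the flock's center, match your neighbors' heading, and push away from anyone too close. The Python sketch below is a generic illustration of those rules; the gains and names are our own, not the AI's flocking code.

```python
def flock_step(positions, velocities, dt=0.1,
               cohesion=0.01, alignment=0.05, separation=0.5, min_dist=1.0):
    """One boids-style update for a 2-D flock (illustrative gains only)."""
    n = len(positions)
    if n < 2:  # a lone boid has no flock to react to
        return list(positions), list(velocities)
    new_vel = []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        others = [j for j in range(n) if j != i]
        # Rule 1 (cohesion): steer toward the other boids' center of mass
        cx = sum(positions[j][0] for j in others) / len(others)
        cy = sum(positions[j][1] for j in others) / len(others)
        vx += (cx - px) * cohesion
        vy += (cy - py) * cohesion
        # Rule 2 (alignment): nudge velocity toward the others' mean velocity
        mvx = sum(velocities[j][0] for j in others) / len(others)
        mvy = sum(velocities[j][1] for j in others) / len(others)
        vx += (mvx - vx) * alignment
        vy += (mvy - vy) * alignment
        # Rule 3 (separation): repel from any boid closer than min_dist
        for j in others:
            dx, dy = px - positions[j][0], py - positions[j][1]
            d2 = dx * dx + dy * dy
            if 0 < d2 < min_dist ** 2:
                vx += dx / d2 * separation
                vy += dy / d2 * separation
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

# Four boids starting at the corners of a square drift toward a tight flock
positions = [(0.0, 0.0), (8.0, 0.0), (0.0, 8.0), (8.0, 8.0)]
velocities = [(0.0, 0.0)] * 4
for _ in range(20):
    positions, velocities = flock_step(positions, velocities)
```

The appeal of this scheme is that each vehicle only reacts to its neighbors, so no central planner is needed; the flock-level behavior emerges from the three local rules.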

Besides carrying on the AI's legacy of showstopping demos, the micro UAVs also show promise in the practical applications we're focused on, specifically scientific research and search-and-rescue implementations. Gil used the example of a building on fire—instead of immediately sending in firefighters to manually search every single room for people, someone on the squad could release a fleet of ten or so micro UAVs with cameras to do a wide-range primary search, saving time and, potentially, lives. And, because they are so affordable, they can be left behind if the situation becomes too dangerous to collect them.



2015-07-09: Autonomy Incubator Welcomes Newport News Mayor McKinley Price


The Autonomy Incubator (AI) received yet another illustrious visitor this week when the mayor of Newport News, Dr. McKinley Price, arrived early Tuesday morning for a tour and a demonstration. Mayor Price was first elected to office in 2010 after a much-acclaimed career in dental surgery; he is currently in his second term as mayor.

The mayor and his guests were treated to the full scope of the AI's research and projects, from the Mars Electric Flyer to the live autonomous UAV demonstration we originally prepared for NASA Deputy Administrator Dava Newman's visit. Because of the progress we've made on the Mars Flyer's computer vision capabilities, Jim Neilan and Alex Hagiopol were able to take the vehicle right off the display table and use it in Alex's visual odometry demonstration.

Jim explains the Mars Flyer's mission and hardware.

Alex walks the guests through his PTAM algorithm.

During the live UAV demo, the mayor elected to step into the flight range and view the action up close. The package to be delivered this time was not a 3D printed banana, but a container of dental floss bearing the AI logo—a nod to Mayor Price's career as a dentist. The first run through the tree-dodging course was a little difficult—the UAV didn't quite dodge that little tree in the middle—but the team calmly reset as Danette explained what had happened, and the second run went smoothly.

The second time through, the UAV gave that same tree a comically wide berth. Machine learning!

Gil #DancesWithDrones for the mayor.

After the package had been delivered and the drones had been danced with, the mayor ducked back through the net to get a demonstration of Meghan Chandarana's gesture recognition software, which she used to initiate Javier Navarro's and Bilal Mehdi's coordinated trajectories flight demo. She used hand motions, a Leap Motion™ sensor and her own computer interface to demonstrate how an operator can use gestures to plan the UAVs' path in sequence (first spiral up, then circle, in this case). Then, she flicked her hand up and all four drones rose into the air to do their routine.

Meghan shows off how intuitive gesture-based controls are.

Within a few seconds of takeoff, one of the UAVs veered off course and triggered the vehicles' safety stop, which caused them to land. It was an excellent chance for Javier to explain the safety features that the AI builds into all of its autonomous systems while the rest of the team reset the demo behind him.

Javier gives a compelling talk on autonomous safety behaviors.

Finally, the "demo demon" (as Meghan says) left the AI for good, and the coordinated flight demonstration successfully took to the air, much to the amusement of Mayor Price.

"I have one of these [UAVs] at home that I fly in the backyard," he said. "Now I know where to come for flying lessons."


Thursday, July 9, 2015

2015-07-08: Autonomy Incubator PI Profiles: Loc Tran




Dr. Loc Tran, a recent graduate of Old Dominion University with a PhD in machine learning, is one of the Autonomy Incubator's (AI) newer hires—he's been here since January. In that time, however, he's quickly made himself indispensable as he leads the AI's obstacle avoidance ("tree-dodging") research and serves as the machine learning guru for all the activities that go on here in Building 1222.

If you've followed the AI's demonstrations on Twitter or here on the blog, you'll recognize Loc's work instantly: he's the reason we have an entire artificial forest set up on one half of the flight range. Basically, his work involves using a machine learning algorithm to teach UAVs how to detect and avoid stationary obstacles as they navigate in GPS-denied environments, and since dodging tree trunks under the forest canopy is a compelling example of an environment with no mapping and lots of obstacles, "tree-dodging" came to be the hallmark of Loc's work at the AI.

"I started with about five trees," he said. "Then we put out an ad for old Christmas trees on [the NASA Langley Research Center website] and now..." He gestured toward the small forest filling the far side of the room from his desk. There are about fifteen trees there now, with more arriving every week. Big ones, small ones, spruces, palms; it's a polymer arboretum.

This isn't even all of them; there are some recent deliveries in the back room.
Thanks NASA Langley!

With the capabilities Loc is developing, UAVs could become vital partners to humans in a wide scope of applications, from collecting samples in remote forests for scientists to performing sweeps of wooded areas in search-and-rescue missions. If you're interested in the method of how Loc uses his forest to train the machine learning algorithm on the UAVs, we already asked him about it in a previous post—how convenient is that? Just click here.

Loc explains tree-dodging to Deputy Administrator Dava Newman
and Center Director Dave Bowles during last month's demo

While his research has been both scientifically exciting and entertaining so far, he's looking forward to replacing the safe-but-limited COTS (Commercial Off The Shelf) vehicles he's been using with the Green Machines. With the new, custom-built UAVs' entirely onboard computing capabilities, he says, he'll be able to stage tests and get results that simply are not possible with COTS vehicles this small.

"I'm anticipating our new vehicle because the problems I'm experiencing with the AR. Drones will... be solved," he said.

While he and the rest of the AI wait eagerly for the Green Machines to be declared research-ready, Loc has turned his attention to a specific facet of obstacle avoidance: computer vision. In contrast to intern Alex Hagiopol's research in visual odometry, which has focused on SVO and PTAM methods, Loc has high hopes for an inertial computer vision algorithm he's been investigating called MSCKF (Multi-State Constraint Kalman Filter).

Rather than relying solely on visual information, MSCKF uses both a camera and an IMU (Inertial Measurement Unit) to collect data about position, acceleration, and rotational velocity. The sensors in smartphones that rotate the display when the phone tilts sideways are IMUs, for example—they sense acceleration and changes in direction. Then, MSCKF employs an algorithm called an extended Kalman filter to filter out the unnecessary "noise" from those two sources and synthesize all the relevant information into an idea of where an object is in space.
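MSCKF itself is well beyond a blog snippet, but the predict-with-IMU, correct-with-camera cycle at its heart shrinks nicely down to one dimension. The Python sketch below is a plain Kalman filter of our own devising, not the real multi-state algorithm: the IMU dead-reckons between camera fixes, and each fix pulls the estimate back toward the truth.

```python
import random

def predict(x, v, p, accel, dt, q):
    """IMU step: dead-reckon position and velocity; uncertainty p grows by q."""
    x += v * dt + 0.5 * accel * dt * dt
    v += accel * dt
    return x, v, p + q

def update(x, p, z, r):
    """Camera step: fuse a position measurement z with noise variance r."""
    k = p / (p + r)                  # Kalman gain: how much to trust z
    return x + k * (z - x), (1 - k) * p

random.seed(1)
x, v, p = 0.0, 1.0, 1.0             # position estimate, velocity, variance
true_x = 0.0
for step in range(100):
    true_x += 1.0 * 0.1             # the vehicle really moves at 1 m/s
    # IMU data arrives every step, but the accelerometer reading is noisy
    x, v, p = predict(x, v, p, random.gauss(0.0, 0.2), dt=0.1, q=0.01)
    if step % 10 == 0:              # camera fixes arrive far less often
        x, p = update(x, p, true_x + random.gauss(0.0, 0.05), r=0.05 ** 2)
```

Without the camera updates, the accelerometer noise would make the position estimate drift without bound; each update also collapses the variance p, which then grows again until the next fix arrives. The real MSCKF applies this same cycle over a sliding window of past camera poses instead of a single scalar state.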

When talking about algorithms and concepts this complex, it's easy to get tangled up in the jargon and lose sight of just how amazing MSCKF could be. Fortunately, many people doing this kind of research are putting videos of their tests online—the clip below came from Dr. Mingyang Li in California.



"It's a very accurate way of saying, 'This is where you've been and this is where you are now,'" Loc said.  If implemented, then MSCKF would give the Autonomy Incubator's UAVs "unlimited range" and "total independence from the ViCon."

Between teaching UAVs to navigate autonomously and looking for new ways to expand the Autonomy Incubator's research into real-world applications, Dr. Loc Tran's first months at the AI are clearly the beginning of a promising research trajectory.



Wednesday, July 8, 2015

2015-07-07: Autonomy Incubator Celebrates First-Ever Untethered Outdoor Flight



The Autonomy Incubator (AI) and the newly formed UAV Operations group saw their inaugural untethered UAV take to the skies today on the back fields of the NASA Langley Research Center, an accomplishment that represents over a year's effort from dedicated LaRC employees to acquire a Certificate of Authorization (COA) from the FAA.  As the attention surrounding last week's tethered flight test revealed, the Center's proximity to Langley Air Force Base added complexity to the already stringent regulations on unmanned vehicles. This monumental flight, with a Hex Flyer under the control of AI pilot Zak Johns, was the first of many outdoor tests and demonstrations the AI plans to perform with the center's new authorization to fly on site.

The current COA gives us the ability to fly in the area colloquially referred to as the "back 40," the empty land surrounding the Gantry where the sled for testing the Space Shuttle's tires used to be.  While today's flight took place in only one of the eight sections of the back 40, the AI will eventually be able to use the entire range area for testing as part of a center project called CERTAIN: City Environment for Range Testing of Autonomous Integrated Navigation.



A fourteen-pound Hex Flyer performing maneuvers in close proximity to a large crowd of admirers requires continued diligence from LaRC safety, so all spectators watched from a cordoned-off area about 50 yards away from the launch site and received a thorough briefing from Tommy Jordan before the flight.




While the safety briefing was going on, AI member John Foggia radioed in to Langley Air Force Base's tower to let them know that we were preparing to take off, exercising the newly developed protocol with LAFB for flying unmanned vehicles on site at LaRC.


The Hex Flyer made two six-minute flights, using a predictable circular pattern as it ascended to the top of its range and descended again.  While the outdoor tests might not, technically, have had the white-knuckle factor of our more theatrical demos—no one flung themselves into the path of an oncoming UAV, for example—the implications of these two simple exercises are thrilling for the AI. Armed with our new COA at NASA Langley, we can test and evaluate the AI's capabilities in a realistic outdoor environment, keeping us at the forefront of autonomous flight research and empowering our researchers to realize the real-world applications of their work. Today's success created infinite opportunities for success tomorrow.


AI Head Danette Allen explains the equipment to Meghan, Gil, and Bilal

Curious about the flight? Watch this video of the most interesting parts, artfully edited together by the same radiant, hardworking social media intern who recorded it. She even let a grasshopper climb halfway up her leg while she was filming because she was so committed to getting this footage. How could you pass up something so great that it was worth being swarmed by grasshoppers?