Friday, June 26, 2015

2015-06-26: Autonomy Incubator Welcomes New Intern Meghan Chandarana


The already robust population of interns at the Autonomy Incubator (AI) gained a new member this week with the arrival of Meghan Chandarana, who will be working with Anna Trujillo, the PI for HSI in the AI. Meghan is a PhD candidate in Mechanical Engineering at Carnegie Mellon University.

Meghan hails from Stockton, California, and completed her undergraduate studies in engineering at UC Berkeley. She is also a veteran NASA intern; she spent last summer at Marshall Space Flight Center (MSFC) in Alabama. Her work there, as well as her work on her thesis, reflects her specialty: controlling robots with gestures.

Yes, really— Meghan's work makes it possible for anyone to control a robot simply by holding his or her hand over a 3D sensor. For her project last year at Marshall, she programmed a robotic arm mounted to a CubeSat to mimic hand movements. Just look at this amazing demonstration she recorded to show her CMU lab mates:


It's like something out of a movie, right?  But Meghan's just getting started—here at the AI, she wants to use the same techniques to teach UAVs new maneuvers.

"Let's say a quadcopter needs to turn sideways to fly through a gap between two trees," she said. "Instead of coding for hours, you could just show it the maneuver with your hand."

She's already got her 3D sensor up and running on her desktop, along with a demo that uses hand gestures to move a cube in three dimensional space on the screen. Her goal for this summer's project is to create a system of human-machine communication that's not only accurate, but natural to use.
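Curious how a demo like that might work under the hood? Here's a minimal Python sketch (purely illustrative— not Meghan's actual code, and the sensor ranges and scale factors are invented) of how a hand position reported by a 3D sensor could be mapped onto an on-screen cube's coordinates:

```python
# Illustrative sketch: mapping a hand position from a 3D sensor into the
# translation of an on-screen cube. Ranges and scales are made-up values.

def clamp(value, lo, hi):
    """Keep a coordinate within the sensor's trusted range."""
    return max(lo, min(hi, value))

def hand_to_cube(hand_mm, sensor_range_mm=(-200.0, 200.0), cube_range=(-1.0, 1.0)):
    """Linearly map a hand position in sensor space (millimeters)
    to cube coordinates in normalized screen space."""
    s_lo, s_hi = sensor_range_mm
    c_lo, c_hi = cube_range
    scale = (c_hi - c_lo) / (s_hi - s_lo)
    return tuple(c_lo + (clamp(axis, s_lo, s_hi) - s_lo) * scale
                 for axis in hand_mm)

# A hand at the center of the sensor's field maps to the cube's origin:
print(hand_to_cube((0.0, 0.0, 0.0)))         # (0.0, 0.0, 0.0)
print(hand_to_cube((200.0, -200.0, 100.0)))  # (1.0, -1.0, 0.5)
```

The real system, of course, has to do this continuously and smoothly from a live sensor stream, but the mapping idea is the same.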

"[Gesture-based controls] should be as effortless and intuitive as the way I'm talking to you now," she said: no sensor-laden gloves or specialized training needed.

After such an impressive first week and with such ambitious goals for the future, we can't wait to see what kind of exciting contributions Meghan brings to the Autonomy Incubator.




2015-06-22: Autonomy Incubator Triumphs in Demo For Deputy Administrator


After last week's flurry of high-profile visitors to the Autonomy Incubator (AI), our witty and beautiful social media intern has spent every spare second compiling material specifically for today's post: the recap of the AI's visit from Deputy Administrator Dava Newman. You followed our preparations, you saw the live tweets, and now, at last, the day has come to experience our demonstration for Deputy Administrator Newman in full multimedia glory.

The Deputy Administrator, accompanied by Center Director David Bowles, arrived at Building 1222 at 4:30pm on Thursday, and was greeted by a crowd of Autonomy Incubator team members, representatives from NASA Langley's administration, and scientists from other related programs at Langley. Dr. Danette Allen, the Head of the Autonomy Incubator, gave an introductory speech focusing on how unique the AI is in both its research thrust and its use of agile development... in her words, "doing different work, but also working differently."

Center Director Bowles provides some background  information on the Autonomy Incubator to Dr. Dava Newman
Danette answers a question from the Deputy Administrator

After the introduction to the AI, Jim Neilan conducted a showcase of the technology and vehicles that the AI uses in its pursuit of autonomous flight, such as lidar and the Mars Electric Flyer.  Danette used the opportunity to get a picture with everyone in the lidar display.

Jim explains the AI's involvement in the Mars Flyer project
Danette strikes a disco pose as Garry takes a picture

All this talk of robots and vision provided a perfect segue into Alex Hagiopol's presentation, in which he gave a live demonstration of the PTAM (Parallel Tracking And Mapping) visual odometry algorithm that he recently finished testing.

Alex gives a brief overview of visual odometry

PTAM in action
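For readers wondering what visual odometry actually computes: PTAM itself is a sophisticated system that builds and tracks a map of features across keyframes, but the core intuition fits in a few lines. Here's a toy Python sketch (our own illustration, not Alex's code— the feature matches are invented) of estimating image-plane motion from features tracked between two frames:

```python
# Toy illustration of the core idea behind visual odometry (not PTAM itself):
# when the camera moves, tracked features shift across the image, and the
# average shift estimates that motion. The matched features are invented.

def estimate_shift(features_prev, features_curr):
    """Estimate 2D image-plane shift from matched feature positions."""
    n = len(features_prev)
    dx = sum(c[0] - p[0] for p, c in zip(features_prev, features_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(features_prev, features_curr)) / n
    return dx, dy

prev_frame = [(100, 120), (240, 80), (320, 200)]
curr_frame = [(105, 118), (245, 78), (325, 198)]  # scene slid right and slightly up
print(estimate_shift(prev_frame, curr_frame))  # (5.0, -2.0)
```

A real pipeline adds feature detection, outlier rejection, and full 3D geometry on top of this, which is exactly what makes PTAM interesting.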

Finally, Anna Trujillo used visual odometry as a springboard into her topic, human-machine teaming, by explaining how a UAV that can "see" features and avoid them autonomously could be implemented in a suburban setting to deliver packages. She even clicked through a prototype user interface that a delivery service employee could use to set up a delivery, including choosing where around the house the UAV drops the package (the front porch or the side door, for example). Anna also described the Autonomy Incubator's intention to perform an outdoor, full-scale demo here on center within the next year, which was met with much excitement from the crowd.

Anna points out the area of the center we'll be using for on-center outdoor package delivery

At last: the hour of flying robots was at hand. The Deputy Administrator and the Center Director donned safety glasses and stepped into the flying range, the net separating the viewing area from the demonstration was drawn back, and Dr. Loc Tran introduced the live demonstration: the UAV's task was to take off, navigate through the forest into the neighborhood and deliver its banana package to the porch of the red house despite any obstacles it may encounter (#DancesWithDrones).  As a tongue-in-cheek surprise, the interns incorporated a little bit of theater into the demonstration— Josh became the inhabitant of the red house who had ordered a banana (for scale!), and Gil played his clumsy friend who kept wandering into the path of the UAV.

Loc introduces the live UAV demo

Gil dances with drones as Josh looks on from the safety of his home

Gil explains the UAV's safety stops

Once Gil stopped "dancing", Javier stepped in to introduce his and Bilal's coordinated flight demo while other AI team members scrambled behind him to replace the UAV with a fleet of four vehicles, all wearing their new NASA cowling decals. This demonstration, Javier explained, was actually a small-scale model of a mission they had designed for LaRC scientists to collect data with vehicle-mounted ozone sensors. The four UAVs took off in unison and flew in a slow, ascending spiral, then one flew into the middle of the operational area and the other three orbited it for a while before they all landed simultaneously.
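For a flavor of what coordinated flight looks like in code, here's a hedged Python sketch (not Javier and Bilal's actual algorithms— the radius, climb rate, and step count are made-up values) that generates ascending-spiral waypoints for a fleet, with each vehicle phase-offset around the circle:

```python
import math

# Illustrative waypoint generator for an ascending-spiral maneuver like the
# one described above (a sketch, not the AI's actual trajectory algorithms).
# Each of the four vehicles is offset by 90 degrees around the circle.

def spiral_waypoints(vehicle_index, n_vehicles=4, steps=20,
                     radius=2.0, climb_per_step=0.1):
    """Return (x, y, z) waypoints for one vehicle in a slow ascending spiral."""
    phase = 2 * math.pi * vehicle_index / n_vehicles
    points = []
    for k in range(steps):
        angle = phase + 0.2 * k          # slow rotation around the center
        z = 1.0 + climb_per_step * k     # steady climb from 1 m altitude
        points.append((radius * math.cos(angle), radius * math.sin(angle), z))
    return points

wps = spiral_waypoints(0)
print(len(wps), wps[0])  # 20 waypoints, starting at (2.0, 0.0, 1.0)
```

The hard part in practice isn't generating the paths— it's keeping the vehicles synchronized in time while tracking them, which is where the trajectory collaboration algorithms earn their keep.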

Javier provides some insight into the algorithms behind the flight

Nick and Bilal look on determinedly as the UAVs ascend

Bilal steps away from the computer and narrates the flight, emphasizing that these vehicles are flying by themselves

With that, the live demonstration came to an end, and everyone moved to the other side of the net for a look at the GL-10 aircraft. For those who aren't familiar, the GL-10 is an aircraft capable of taking off and landing vertically, yet also able to transition into forward, fixed-wing style flight. It's not strictly an Autonomy Incubator project, but our team has been involved since its beginnings— plus, it's really, really cool. The GL-10 team even brought an Oculus Rift VR headset that played a 360-degree flight video, which the Deputy Administrator used to take a virtual flight on the aircraft.

Bill Fredericks talks the Deputy Administrator and the Center Director through the GL-10's innovative design

The Deputy Administrator takes flight on the virtual GL-10

Finally, having seen all the many wonders of the Autonomy Incubator, Deputy Administrator Dava Newman bid the crowd farewell, and expressed her anticipation of our outdoor tests. "I'll have to put in my banana order!" she said.  

Thursday, June 25, 2015

2015-06-25: Autonomy Incubator Tests Visual Odometry on Mars Flyer




Today, Autonomy Incubator (AI) student intern Alex Hagiopol and UAV pilot Zak Johns conducted the first-ever tests of the visual odometry algorithms onboard the Mars Flyer. The test, and the ones to follow, are an exciting extension of Alex's visual odometry research, which had previously been limited to manually moving a camera over a scene of features. The ultimate goal of integrating visual odometry onto the Mars Flyer is to create a reliable system of navigation for the vehicle in the GPS-denied environment of Mars.

The Mars Flyer is unique for a number of reasons, but today's tests highlighted just how packed with technology this tiny craft is— for example, the ODROID™ computer mounted to the wing has eight cores. For reference, the average laptop has four. It can collect information straight from the ground-facing webcam, then run all of the visual odometry calculations onboard, without even turning its fan on. All this from a computer smaller and lighter than a chocolate bar.

"I've even been developing on this thing," said Alex as he slid the ODROID into its custom 3D-printed mount on the Mars Flyer. "It's amazing."

The components stay in place with Velcro in addition to the plastic mounts.

Once all the Mars Flyer's components were in place and sending data to Alex's desktop, Zak got it situated for takeoff while Alex scattered some of his special feature-rich mats (designed to ensure that the visual odometry algorithms have something to pick up on) around the operational flight area.


After takeoff, Zak flew the Mars Flyer back and forth at altitude while Alex manned the desktop and watched the information stream in. Here, Alex points out where the visual odometry algorithm is marking features on the video feed, as data points flood into the terminals in the background.


The first flight test was a success, but Alex still has plenty of work ahead of him. Now, he says, he'll spend the next few days getting ready to flight test a different visual odometry algorithm.

2015-06-24: Autonomy Incubator Supercharges UAV Fleet With New Quadrotors

One of the Green Machines, with a 3D printed housing designed in the AI

As Autonomy Incubator Head Danette Allen announced during Deputy Administrator Dava Newman's visit last week, the AI will soon begin performing test flights with a fleet of four fully customized UAVs. These will complement the off-the-shelf Parrot AR.Drones™ that have served us well and will continue to be used for rapid development and test; the new quadcopters, however, are sturdier, more powerful, and more versatile in research applications than consumer-level drones.

The main difference between UAVs sold for recreation and our customized ones, affectionately called "Green Machines," is the intent behind their design. The COTS (commercial off-the-shelf) vehicles we have are excellent for interactive demos like #DancesWithDrones because their protective cowlings and light weight make them easy and safe to handle. For example, look at Javier testing the robustness of these four AR.Drones earlier this week:


The downside of using these vehicles lies in their limitations for our research— most COTS vehicles just aren't built to fly autonomously. Their controls aren't designed that way, and they're too light to carry anything heavier than a 3D-printed banana (for scale!). With the Green Machines, researchers at the AI can write their own algorithms in a way that's intuitive for what they're doing, which is going to provide unlimited opportunities for innovation. In addition, these vehicles can generate enough lift to carry a sizable payload of hardware, and that's where things really start to get exciting.

A side-by-side comparison.
What a cool-looking piece of aviation.

What's under that little green hood? From the computer running the flight algorithm to the autopilot controlling the motors, everything is onboard. There's even a speaker on top as part of a human factors effort called intent management, which lets the vehicle announce its next move to the humans in the area. All these onboard capabilities are critical as we move away from running our flight algorithms on a ground-station computer and transmitting instructions to the vehicle over Wi-Fi. That arrangement works beautifully as long as every link in the communications chain is running smoothly; by putting all the technology onboard, however, we are no longer challenged by comm issues. With the aerial robot generating its own instructions and no complicated lines of communication, autonomous flight becomes more reliable and efficient.

There's only one way to keep that much equipment in the air and agile, and it's by laying down some serious thrust with those propellers. Safety is already a top focus of the Autonomy Incubator, and with these more powerful vehicles, that focus will only intensify. Even now, everyone in the building remains behind the net when something bigger than an AR.Drone takes flight. If the researchers conducting the flight absolutely must be in the indoor flight range with the Green Machines, they'll be wearing safety goggles, possibly a hard hat, and working from a safe distance away. 

But this progress brings a pressing question to the surface: what will become of #DancesWithDrones once the Green Machines move in?

"I'm not standing in front of it!" #DancesWithDrones creator Gil Montague said, laughing. "There will be a different obstacle. Maybe we'll get a rover and tape a cardboard cut-out of me to it."

Even though this new era will mean losing his place in the spotlight, Gil is confident that we only have better and more spectacular things to expect from the AI in demonstrations to come.

"[The Green Machine] opens up a whole realm of more compelling demonstrations," he said.


Wednesday, June 24, 2015

2015-06-23: Autonomy Incubator Profiles: Nicole Ogden

Nicole and her fellow intern, Curt Feinberg, love the AI's homey touches.

Nicole Ogden is a Master's student in Electrical Engineering at North Carolina A&T who works with the Autonomy Incubator (AI) in conjunction with her internship at the NIA. But she's also the AI's version of the Most Interesting Man In The World: she spent nine months working at a web development company in Belfast, Northern Ireland, and traveling at least once per month. So far, she's traveled—solo!—to Berlin, Dublin, Pompeii, Rome, Paris, Barcelona, Madrid, Brussels, Amsterdam, and Edinburgh, and those are just the cities she had time to mention in our interview.

"I travel because of the history," she said. "These are the type of history lessons I can't get in school. Also, after coding all day every day at my job, my brain needed a break!"

When she's not working on her project—engineering a damage-proof system of communication for UAVs to make them more durable in the field—she loves to tell stories about her travels. She's lived through enough bizarre encounters and touching moments to keep anyone entertained for hours; she even plans on writing a book someday about her life abroad. Included, she said, will be some of the 3,000 pictures she took during her adventures.

We spoke to Nicole for an afternoon about her utterly fascinating life. Here, for your benefit, are the highlights and takeaways of what she taught us:
  • The Guinness brewery in Dublin has a 9,000 year lease on its building.
  • When visiting a non-English speaking country, it's courteous to learn at least a few phrases in the language first. It's easy, she said: "I just used a few free apps on my iPod."
  • Nutella. Nutella, Nutella, Nutella. NUTELLA.
  • The best way to make friends when alone and abroad is to sign up for walking tours. "All those people are by themselves too," Nicole said.
  • You can tell who the Americans are at the airport because they're the only ones who take their shoes off at airport security.
  • Scotland has the best food in the British Isles, but Ireland has better weather.
  • There's a piece of graffiti on the Peace Wall in Belfast of a soldier pointing a gun at the viewer, and it's called "The Belfast Mona Lisa."  Whatever angle you look at it from, the gun seems to be pointing at you.
  • In Europe, key indicators of Clueless American Tourist Syndrome include white socks, fanny packs, maps, bright clothes, and smiling too much. Tone your appearance down, and you're less likely to be hassled.

Although she has already roamed far afield of her home in Buffalo, New York, Nicole says she plans to visit more countries after she graduates.

"I definitely want to go back," she said. "I want to see Northern Africa too."

Given her globetrotting spirit, the Autonomy Incubator is thrilled that she's landed in Hampton this summer because of what her research means for the future of autonomy. Currently, if a UAV's communication component is damaged while out on a mission, there's only one outcome available— freeze and crash. Clearly, such a glaring vulnerability can make even the smartest of robots impossible to use in adverse conditions. For Nicole, the challenge is clear: augment the communication system to be resistant to failure. She's tackling the problem from two different angles—physical and systematic— and intends to incorporate both into her thesis.
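Nicole's actual design isn't detailed here, but one classic failure-resistance pattern gives a feel for the "systematic" side of the problem. This Python sketch (hypothetical, not her implementation) fails over between redundant communication links and falls back to a safe autonomous behavior rather than freezing and crashing:

```python
# Hypothetical illustration of one failure-resistant pattern: try redundant
# communication links in priority order, and fall back to a safe autonomous
# behavior if all of them are down. (A sketch, not Nicole's actual design.)

def send_with_failover(message, links):
    """links: list of (name, send_fn); each send_fn returns True on success."""
    for name, send in links:
        if send(message):
            return f"sent via {name}"
    return "all links down: entering autonomous loiter-and-return"

primary = lambda m: False   # damaged radio
backup = lambda m: True     # still healthy
print(send_with_failover("telemetry", [("primary", primary),
                                       ("backup", backup)]))
# sent via backup
```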

Whether it's reliable unmanned systems or travel advice you want, Nicole Ogden is the person to find.







Monday, June 22, 2015

2015-06-19: Autonomy Incubator Seminar Series Welcomes Professor Missy Cummings



This month, the Autonomy Incubator (AI) was thrilled to have Dr. Missy Cummings as the guest speaker in its eponymous monthly seminar series. The title of her presentation was "Man Versus Machine or Man + Machine?" and she touched on a variety of topics, from driverless cars to the semantics of "machine learning."

Dr. Cummings is quite literally a woman who needs no introduction; in addition to earning a place in history as one of the Navy's first female fighter pilots, her work with semi-autonomous flight has led her everywhere from an appearance on The Colbert Report to a TEDMED talk. You can watch her excellent 2013 guest spot on The Daily Show right here:


After ten years as a professor at MIT, Dr. Cummings recently moved to Duke University to lead the flagship Human and Autonomy Lab, or HAL. (Yes, she says; that's a pun.) Today's talk, "Man Versus Machine or Man + Machine," grappled with that junction of humans and autonomous machines.  How can the two best live in cooperation? How do we determine, in her words, "what should computers do and what should humans do?"

Cummings is a vocal advocate for a future in which humans and machines work together and she's working to show that this future is possible. Her team at MIT designed an interface that allows anyone to fly a semi-autonomous UAV with only three minutes of training. Humans, she said today, will always be part of the work that machines do because we have the capability to gain expertise, while machines won't be able to do the same for the foreseeable future. With expertise comes the ability to react to certain situations in ways that machines wouldn't necessarily know how to handle correctly. In the same vein, she disapproves of labels like "artificial intelligence" and "machine learning" for autonomous machines, because she views them as misleading.  What some call "intelligent" machines are just really good at pattern detection and rule following, she opined, not really capable of independent decision-making like a person would be.

Her work in recent years has included myriad applications of automation and autonomy, from making military IED detection trucks remote-controllable to keep soldiers safe, to helping park rangers in Gabon keep track of elusive forest elephant populations from the air, to fully automating the dump trucks in an Australian mine to prevent human-caused accidents and injury. Dr. Cummings is also contributing to the development of the Google driverless cars, although she herself remains skeptical of the idea: "Driverless cars scare me the most," she said, laughing.

After her talk, Dr. Cummings joined the AI team and colleagues for lunch in the Langley cafeteria followed by a tour of the high-fidelity flight simulators in Building 1268.  The day culminated with the AI team at the Autonomy & Robotics Center in Building 1222, where Javier and Bilal successfully increased the number of UAVs in their collaborative trajectory flight demo from four to six!

Friday, June 19, 2015

2015-06-18: Autonomy Incubator Shows off New Lidar

MARTI student interns use the lidar to spell "ACMY."
Not sure what that is, but we've heard it's a fun place to be.

The Autonomy Incubator (AI) recently acquired a Fotonic E70 lidar. It works similarly to radar, but whereas radar ("Radio Detection And Ranging") uses radio waves to sense objects, lidar uses light— in this case, from low-intensity lasers. Lidar can be used to measure distance (among other things!), and when the data points from many laser scans are combined, the resulting point cloud is a clear picture of the surrounding area.
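To make those two ideas concrete, here's a back-of-the-envelope Python sketch (our own illustration— not the Fotonic E70's actual interface, and the scan data is invented): distance from laser time of flight, and a 2D point cloud built from (angle, range) returns:

```python
import math

# Back-of-the-envelope lidar math (not the E70's real interface): range from
# laser time of flight, and a 2D point cloud from (angle, range) scan returns.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_s):
    """The pulse travels out and back, so distance is half the path."""
    return C * round_trip_s / 2

def scan_to_points(scan):
    """Convert (angle_rad, range_m) returns into Cartesian (x, y) points."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]

print(round(range_from_time_of_flight(66.7e-9), 1))  # ~10 m for a 66.7 ns echo
print(scan_to_points([(0.0, 5.0), (math.pi / 2, 5.0)]))
```

Sweep the laser across many angles, stack up scans from different positions, and those (x, y) points become the point clouds our visitors saw on the lidar display.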

One of the features that gives lidar an advantage over optical cameras is its indifference to sunlight. Many camera-based sensors are challenged by changing lighting conditions, but a lidar system can work as well at high noon as in the middle of the night. Attached to a UAV, a lidar sensor could become part of a visual odometry system that allows the vehicle to see and react to obstacles in real time. Lidar can also be combined with other sensor types for multi-modal sensing. For missions like search-and-rescue, for example, an additional thermal camera could allow the vehicle to detect body heat.

The Fotonic E70 lidar is currently mounted on one of the AI's Hex Flyers.





Thursday, June 18, 2015

2015-06-17: Autonomy Incubator Prepares for Deputy Administrator Visit


Our makeshift "cul-de-sac," complete with flowers and shrubs

As tomorrow afternoon's visit from Deputy Administrator Dava Newman draws ever nearer, the Autonomy Incubator (AI) team has been working with single-minded focus to get each "act" of the demonstration running smoothly. While the PIs split their time between readying their own short presentations and contributing to the live UAV presentation rehearsal, the AI student interns take a break from their research to put NASA and AI decals on quad-rotor cowlings and film test runs on the camcorder.

Perhaps the most impressive aspect of the already incredible UAV presentation is just how seamless it is— tree-dodging flows into package delivery and then into "Dances with Drones", which flows straight into multi-UAV coordinated flight.  This demonstration is intended to take every dimension of the AI's research portfolio and incorporate it into one cohesive narrative, with help from a full set complete with three prop houses and an entire forest of fake trees and flowers. We might be biased, but we think this is going to be the most impressive—and most difficult— demo we've ever done.

Theatrics aside, look at how industriously everyone in the lab is working! The atmosphere is intense but excited; it feels like aerospace Christmas Eve here in Building 1222.

Loc and Gil make some network adjustments
Jim and Garry discuss how to best showcase the lidar
Michael and Lauren cut out the paper decals for the UAVs
Jeremy outfits a UAV with official NASA and AI decals












Wednesday, June 17, 2015

2015-06-16: Autonomy Incubator Upgrades Tree-Dodging Demo


As the entire AI team prepares for NASA Deputy Administrator Dr. Dava Newman's inaugural visit to NASA LaRC this Thursday, Dr. Loc Tran has built a veritable artificial forest in which to present his autonomous obstacle avoidance—colloquially "tree-dodging"— machine learning research. So far, nine Christmas trees, three potted palms, one fern relocated from the lobby and one ficus have sprouted up in the indoor flying range.

While obstacle avoidance has always been an area of interest for the Autonomy Incubator, as our Dances With Drones demonstration shows, Loc's work is especially amazing because of just how much autonomy his UAVs command. In addition to path planning, his program uses information from a front-facing camera on the UAV to learn to detect features as it flies, and then it avoids obstacles based on those features. If the UAV makes a mistake, Loc goes over the video and corrects the vehicle's responses. These corrections are used to create a new model, the UAV flies again, and the process is repeated until the vehicle can safely maneuver through the course. Eventually, with enough training, the UAV will become so skilled that it can navigate a course it's never seen before. Literally, Loc teaches the UAV how to avoid obstacles, and it learns from his guidance.
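That train-fly-correct cycle resembles what the machine learning literature calls interactive imitation learning. Here's a schematic Python sketch of the loop described above (the function names are placeholders, not Loc's actual code):

```python
# Schematic sketch of the train-fly-correct loop described above. The
# callables are placeholders: `train` fits a model to the dataset,
# `fly_and_record` runs a flight and logs it, and `expert_correct` returns
# the human's corrections for any mistakes (empty means a clean run).

def training_loop(initial_dataset, train, fly_and_record, expert_correct,
                  max_rounds=10):
    dataset = list(initial_dataset)
    model = train(dataset)
    for _ in range(max_rounds):
        flight_log = fly_and_record(model)     # camera frames + actions taken
        mistakes = expert_correct(flight_log)  # human labels the bad choices
        if not mistakes:                       # clean run: training converged
            return model
        dataset.extend(mistakes)               # corrections join the dataset
        model = train(dataset)                 # retrain and fly again
    return model
```

The key property is that the human never writes avoidance rules by hand— the corrections themselves become training data, which is exactly the "Loc teaches, the UAV learns" dynamic.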

During Thursday's exhibition, the UAV will take off with a package (most likely a 3D-printed banana - for scale, of course), autonomously navigate the forest, transition into a "neighborhood", and then... 

What, did you think we were just going to put all the juicy details of our coolest demonstration ever online? Absolutely not. Check back tomorrow for more pictures and more information about the AI's plans for Deputy Administrator Dava Newman's visit.

Monday, June 15, 2015

2015-06-15: Autonomy Incubator PI Profile: Anna Trujillo


Anna's area of expertise in the Incubator's push for autonomy is human factors, or examining the ways humans and machines communicate. It sounds abstract, but as she explains, it's a concept that applies to pretty much everyone, every day: "Think about getting on an elevator," she said. "You push a button to tell it where you want to go. How do you know it registered your choice? The button lights up."

So essentially, she researches the different ways that humans interface with autonomous robots and seeks to make those interactions as intuitive and efficient as possible. Luckily for everyone, Anna has a clear vision of what a future where humans and robots work side-by-side will look like.

One goal for the future, Anna said, is to have one human operator be responsible for multiple UAVs. In current-day operations, one UAV requires multiple people to ensure that each flight goes well: there's the pilot, plus a spotter, plus however many auxiliary people the particular mission might require. It would be so much better, she said, if that ratio of people to robots were reversed.

"There will still be someone who has to hit the 'go' button," she said, but once the machines become fully autonomous, they won't need the operator unless they have something to report or they encounter a problem that they aren't sure how to solve. "They can say, you know, 'Hey, we have a question!'" she said, and the human operator can step in and help them. Otherwise, the robots will only need the operator to give them a mission and send them on their way.
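That "only bother the human when needed" pattern is sometimes called management by exception. As a hedged illustration (our own, not Anna's software— the vehicle reports are invented), an operator's console might filter fleet status like this:

```python
# Hypothetical management-by-exception filter: vehicles proceed autonomously,
# and only questions and problems reach the single human operator.

def operator_inbox(vehicle_reports):
    """Return only the reports that actually need the operator's attention."""
    return [(vid, msg) for vid, status, msg in vehicle_reports
            if status in ("question", "problem")]

reports = [
    ("uav-1", "nominal", "on route"),
    ("uav-2", "question", "ambiguous drop zone: porch or side door?"),
    ("uav-3", "nominal", "holding altitude"),
    ("uav-4", "problem", "low battery, requesting return"),
]
print(operator_inbox(reports))
# → only uav-2's question and uav-4's problem reach the operator
```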

This new dynamic might seem like it would limit human interaction with robots, but once the operator releases them into the world, there are myriad ways that autonomous UAVs and people can work together. For example, UAVs can sweep large areas of land in a search-and-rescue mission, then call for help once they find the target. A similar instance of teamwork is already happening in police departments with ground vehicles (as we found out last month); a tele-controlled police robot can travel into a house and allow its human coworkers to see whether it's safe to enter or not. Once police robots gain a degree of autonomy, they'll be able to enter a house, clear the different rooms, and then signal to the team outside that it's safe to enter, all without input from their operators.

Even people without professional training will someday be able to comfortably interface with autonomous machines. Anna brings up the recent buzz in Japan around "helper robots," humanoid robots designed to fill the growing care needs of its elderly population. Right now they remain relatively limited in what they can do, but eventually, these machines will be so autonomous that if someone says "I want a cup of coffee," they'll go into the kitchen, get a mug, fire up the coffee machine, and gently deliver a hot cuppa to the person who requested it.

Japanese ROBEAR nurse robot, able to carry wheelchair-bound patients. (source)

"You don't have to say, 'okay, here's the kitchen, now go there, now open the cabinet door, now find the second shelf, now grasp the mug without crushing it...' It just does it. It has the necessary knowledge to do it just like a human aide would," Anna said of the forthcoming species of domestic robots.

Natural-language interface—telling the robot "I want coffee" like one would a waiter— between human and robot is another major focus of human factors.  The ultimate idea is to abstract the interface into something intuitive for humans, while letting the artificial intelligence do all the tedious plugging in of commands and path-finding.

No matter how naturally we are able to converse with robots, though, none of it matters if we don't feel at ease interacting with them. That's why, Anna emphasizes, it's crucial to have "trust and transparency" in human-robot interactions: for a person to accept the robot as safe, she needs to be able to see and understand the robot's reasoning.  Natural-language interface helps with this, as does carefully choosing what level of information the robot relays to the human on a regular basis— too much information is overwhelming, too little can feel uncomfortable.

Human factors research, like humans themselves, is complicated and often messy. With Anna Trujillo at the helm, our relationships with robots don't have to be.









Friday, June 12, 2015

2015-06-12: Autonomy Incubator Sets In-Lab Autonomous Flight Record for Multiple UAVs


Yesterday, the intrepid Autonomy Incubator (AI) team of student interns Javier Puig Navarro, Bilal Mehdi, and Gil Montague dared to fly where others fear to tread. They pushed the limits of both technology and ingenuity, went toe-to-toe with the impossible, and pulled off some of the most white-knuckle science the AI has demonstrated to date. Yesterday afternoon, at 4:38 PM, this band of robotics renegades flew five—FIVE— UAVs simultaneously, using a tracking system and a few well-written trajectory collaboration algorithms to guide them up into a hover and back down again.

At first, the dream of pulling off such an ambitious flight seemed dogged by snags and hang-ups: from a bad prop on one of the vehicles to trouble calibrating the UAVs' positions, the lead-up to takeoff took hours. Here's Bilal holding a UAV in the flight operational area, testing different pitches, while Javier works on the computer out of frame.


But then, finally, the epic moment arrived and all five robots ascended, hovered in place for a while, and then gracefully descended in coordinated flight.




Why, you ask, was this flight so important? We've been flying solo UAVs for years; surely a few extra is nothing to write home about. Actually, coordinated flight is one of the frontiers of UAV applications, and it becomes exponentially more complicated as more vehicles join in.  If autonomous vehicles, especially UAVs, can be managed reliably in groups, the benefits to science as a whole would be astonishing. Think of a volcano too dangerous to approach, or a remote spot in the Amazon inaccessible by plane— with a team of collaborative vehicles in the air, scientists will be able to employ multiple sensors at once, perform atmospheric testing at multiple altitudes at the same time, even replace research balloons with a method that both provides more data for longer and doesn't generate waste.  And these are just some of the applications we are targeting today; who knows what kind of research will become possible tomorrow using coordinated fleets of UAVs?


2015-06-11: Autonomy Incubator Tests Mars Electric Flyer Prototype

Jim Neilan crouches as he approaches the Mars Flyer so as not to startle it

It's a copter! It's a fixed-wing! No, it's... what is that?

Definitely the most unusual vehicle to grace the Autonomy Incubator recently, the Mars Electric Flyer represents an entirely new genre of UAVs: flyers designed for use on other planets. The Mars Flyer, as its name would suggest, has been painstakingly crafted to fly in the thinner air and lower gravity of the Red Planet.

"Thinner air" is actually a pretty mild term for the conditions where the Mars Flyer is going: the atmosphere on Mars is one hundred times thinner than the Earth's. Such a drastic change in environment calls for an innovative approach to flight. The Flyer is built to fly extremely fast and catch as much lift as possible in order to stay aloft in the minimal atmosphere.
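To see why speed compensates for thin air, consider the standard lift equation, L = ½ρv²SC_L: in level flight, lift must balance weight, so with wing area and lift coefficient held fixed, the required airspeed scales as the square root of g/ρ. Here's a back-of-the-envelope sketch in Python (the baseline cruise speed is a made-up illustrative number, not a Mars Flyer spec):

```python
import math

def required_speed(v_earth, density_ratio, gravity_ratio):
    """Speed needed for level flight when air density and gravity change.

    Lift must balance weight: 0.5 * rho * v**2 * S * CL = m * g.
    Holding wing area S and lift coefficient CL fixed, the required
    airspeed v scales with sqrt(g / rho).
    """
    return v_earth * math.sqrt(gravity_ratio / density_ratio)

# Illustrative numbers only: Mars air is roughly 1/100 the density of
# Earth's at the surface, and Martian gravity is about 0.38 g.
v_earth = 20.0  # m/s, a hypothetical sea-level cruise speed on Earth
v_mars = required_speed(v_earth, density_ratio=0.01, gravity_ratio=0.38)
print(round(v_mars, 1))  # → 123.3 m/s, about 6.2x the Earth baseline
```

The lower Martian gravity helps a little, but the hundredfold drop in density dominates, which is why the design trades everything for speed and simplicity.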

"The design philosophy of the Mars Electric Flyer VTOL is to keep it EXTREMELY simple and lightweight. So we use as few moving parts as possible," said Dave North, the SACD project lead for the Mars Flyer.

In contrast to the current prototype, meant for sea-level testing on Earth, the real Mars Flyer will have props nearly three feet in diameter and will reach speeds up to 200 mph after it switches from vertical takeoff mode to forward flight mode.  During its four- or five-minute flights, the Flyer will collect data about the surrounding area before docking onto the Mars rover, downloading its data, and charging with solar panels for the next Martian day.

All this is terribly exciting, but where does the Autonomy Incubator come in? The Incubator is on board to provide the Mars Flyer with autonomous capabilities. Radio signals take twenty minutes to travel from Earth to Mars, which makes teleoperating the Flyer infeasible. That means that when the Mars Flyer is flying at blisteringly high speeds through the Martian atmosphere, it's going to have to be able to position itself and plot its own path in order to complete its missions.

Jim Neilan and Paul Rothhaar are the PIs in charge of the Incubator's role in the Mars Flyer Project, with student intern Josh Eddy supporting them. The recent tests at the Incubator, Jim said, are intended as "proof-of-concept" for the flight algorithms that the Incubator has been working on. The results of the tests will provide valuable insights into whether the technology we have available now is enough for the Mars Flyer to navigate an unknown environment and report data about that environment back to mission control. 




Wednesday, June 10, 2015

2015-06-10: Autonomy Incubator Bids Farewell to Charles Cross



Today marks a bittersweet occasion for the Autonomy Incubator (AI) team as one of its founding members, Charles Cross, departs from Langley to pursue new opportunities.  Charles, the lead software engineer responsible for architecting the Incubator's Autonomous Entity Operations Network (AEON), will be taking a position at the start-up OpenROV in Berkeley, California.

During his sixteen-month tenure with the Autonomy Incubator, Charles focused on designing a core development framework that could connect PI-developed capabilities across platforms and programming languages.  With AEON, researchers can write applications that connect to and communicate with other applications within the lab right from the start, which allows for easier collaboration. By design of the AI's hiring strategy, Charles comes not from aerospace but from a gaming background, and a deeply entrenched one at that.
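To make the idea concrete, here is a toy, in-process publish/subscribe bus in Python. It is emphatically not AEON or the DDS API (every name below is invented); it only illustrates the decoupling that lets independently written applications talk over shared topics:

```python
from collections import defaultdict

class MiniBus:
    """Toy publish/subscribe bus illustrating the pattern AEON builds on.

    Real middleware like DDS adds typed topics, discovery, quality-of-
    service settings, and network transport; this in-process sketch only
    shows how publishers and subscribers stay decoupled from each other.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to run on every message for this topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of this topic."""
        for callback in self._subscribers[topic]:
            callback(message)

# Two hypothetical lab applications sharing one topic:
bus = MiniBus()
received = []
bus.subscribe("vehicle/pose", received.append)     # e.g. a path planner
bus.publish("vehicle/pose", {"x": 1.0, "y": 2.0})  # e.g. a tracker
print(received)  # → [{'x': 1.0, 'y': 2.0}]
```

The point of the pattern is that the tracker never needs to know which applications are listening, which is what lets new PI code plug into the lab's ecosystem without rewiring everything else.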

"I've been programming since I was nine," he said. According to him, his years of experience designing games were crucial to his process when he was designing—and re-designing—AEON, because both involve a focus on communication between machines.

Charles also focused some of his time at the AI on educating the team, especially interns, on good programming practices.  Here he is yesterday, teaching the summer 2015 interns how to use DDS™ to create an application within our framework:



While everyone at the Autonomy Incubator will miss him, Charles is moving on to some exciting mission applications. At OpenROV, he'll continue to do similar work to what he's done here, but with underwater vehicles instead of UAVs and rovers.  In addition, he plans on doing community outreach with the company to make underwater tele-exploration "accessible to pretty much everyone," he said.


2015-06-09: Autonomy Incubator's #DancesWithDrones To Expand With New Set Pieces



The #DancesWithDrones hashtag has been one of the darlings of the Autonomy Incubator's Twitter feed for months now, and with good reason— it's an amazing demonstration of what autonomous vehicles can do and how safely they will someday integrate into our daily lives. 

But, just what is #DancesWithDrones? Picture the scene: A UAV is flying back and forth, as if ferrying a package from a delivery truck to the front porch of a house. But suddenly— oh no! A person wanders into the path of the vehicle as it barrels toward the house! Disaster will surely ensue, right? Not so, thanks to the work of the Autonomy Incubator team and student intern Gil Montague. Rather than colliding with the person, the UAV will detect and avoid (safely go around) them if there's sufficient space to do so, or stop and hover in midair until its path becomes clear again.  Here's a video of a demonstration that Gil did just last week for NASA's Associate Administrator for Education:


The nickname "Dances With Drones" comes from the way Gil dodges and zigzags around the vehicle like they're partners in a tango. 

This demonstration is fun to watch, absolutely, but it's also a symbol of the massive amount of work that the Incubator is putting into an area of robotics called path planning.  Essentially, the path planner uses input about the robot's position and surroundings— in this case, from the ViCon™ system— to, well, plan a path for the robot from its current position to its goal while avoiding all obstacles in the environment.  When the obstacles move, the path planner changes the path. That's where we get #DancesWithDrones: Gil, the obstacle, forces the path to change every time he blocks the UAV from its goal.
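A hedged sketch of the idea: the planner below runs breadth-first search over a tiny occupancy grid, which is far simpler than whatever the Incubator actually flies, but the replanning loop is the same. When an obstacle appears, you update the grid and call the planner again; if no path exists, the vehicle hovers and waits:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on a 4-connected grid; returns a list of cells.

    grid[r][c] == 1 marks an obstacle. Returns None when the goal is
    unreachable, which corresponds to the UAV's hover-and-wait behavior.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            in_bounds = 0 <= nr < rows and 0 <= nc < cols
            if in_bounds and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal blocked: hover until the map changes

# A person steps into the middle cell; the planner simply routes around:
grid = [[0, 0, 0],
        [0, 1, 0],   # 1 = the moving obstacle's current cell
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 2))
print(path)  # a shortest route around the obstacle
```

Replanning is then just calling `plan_path` again with the updated grid every time the obstacle map changes, which is exactly what forces the path to bend each time Gil steps into the way.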

The prop "house" that arrived today is the first of three houses that will soon populate the incubator as part of a new challenge. The UAV must now not only find a path from the truck to the house, but also choose the right house on the "street" based on human input like the address or even the color of the house, all while continuing to avoid moving obstacles. This new version of #DancesWithDrones serves as a hopeful precursor to a future in which autonomous machines, independent of systems like the ViCon, can navigate safely and efficiently in the real world.




Tuesday, June 9, 2015

2015-06-08: Autonomy Incubator Runs Tests on Visual Odometry System




It's an exciting day here at the Autonomy Incubator, as intern Alex Hagiopol runs the first tests on what will become a GPS-independent localization system.  The end goal of this research is to achieve the same level of control over robots that we get now from outdoor positioning systems such as GPS or indoor tracking systems such as ViCon™.

Perception remains one of the hardest problems in robotics. Currently, our solution to that problem is the ViCon system: a circle of near-infrared cameras that pick up on metallic balls attached to the robot's body, then transmit data about those points' position and orientation 200 times per second. So good is this system at telling where objects are, and so good are scientists at using it, that incredible UAV acrobatics become possible; just look at this TED Talk from 2013 about the "athleticism" of quadcopters:



With rapid and precise tracking systems, UAVs can be controlled to do backflips in an auditorium, but those capabilities don't carry over to the real world, where no such external tracking exists. This, then, is the problem of robotic perception: achieve the same level of precision and control without using an external positioning system.

The Autonomy Incubator's team is tackling the perception problem with an approach called SVO, or Semi-direct Visual Odometry, developed last year at the University of Zurich.  The main computer receives video from a camera on a UAV, recognizes "features" in the robot's surroundings, attaches reference points to them, and calculates speed, position, and orientation from how the robot moves relative to those points.
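As a much-simplified illustration of the principle (not the actual SVO algorithm, which estimates a full six-degree-of-freedom pose by combining direct image alignment with feature refinement), here is a toy estimator that recovers a 2D camera translation from the apparent motion of tracked feature points:

```python
def estimate_translation(prev_pts, curr_pts):
    """Estimate camera motion as the mean displacement of tracked features.

    The features are fixed in the world, so if they all appear to shift
    one way across the image, the camera moved the opposite way. Each
    point is an (x, y) pixel coordinate; the two lists are matched by
    index from one frame to the next.
    """
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    # The scene shifted by (dx, dy), so the camera moved by (-dx, -dy).
    return -dx, -dy

# Three hypothetical features, all shifted 2 pixels left and 1 pixel up:
prev_pts = [(10.0, 5.0), (20.0, 15.0), (30.0, 8.0)]
curr_pts = [(8.0, 4.0), (18.0, 14.0), (28.0, 7.0)]
print(estimate_translation(prev_pts, curr_pts))  # → (2.0, 1.0)
```

Averaging over many features is what gives visual odometry its robustness: a few mismatched points barely move the estimate, and accumulating these per-frame motions yields the trajectory.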

The surest way to test SVO's accuracy is to compare it to "the truth" as generated, in this case, by the ViCon system. Today, Alex took his webcam setup inside the ring to compare the accuracy of this visual method against the tried-and-true results from the infrared cameras. Notice the metallic balls attached to the camera rig, used to track its movement through the observable operational area for data comparison.
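One common way to score such a comparison, and an assumption on our part about the exact procedure used here, is the root-mean-square error between time-aligned position samples from the two systems:

```python
import math

def rmse(estimated, truth):
    """Root-mean-square position error between two aligned 2D trajectories.

    Each trajectory is a list of (x, y) samples matched by index. A real
    evaluation would first align timestamps and coordinate frames and
    would likely use full 3D poses; this sketch shows only the metric.
    """
    assert len(estimated) == len(truth)
    total = sum((ex - tx) ** 2 + (ey - ty) ** 2
                for (ex, ey), (tx, ty) in zip(estimated, truth))
    return math.sqrt(total / len(estimated))

# Hypothetical numbers: an SVO track drifting slightly from ViCon truth.
svo_track   = [(0.0, 0.0), (1.1, 0.0), (2.0, 0.1)]
vicon_track = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(round(rmse(svo_track, vicon_track), 3))  # → 0.082
```

A small RMSE relative to the distance traveled would suggest the vision-only method is keeping up with the infrared ground truth.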



Alex used a patchwork of foam tiles as his test surface, arranged carefully to provide a high-contrast field of features for the algorithm to recognize. Here, we can see the way the SVO algorithm picked up on features on the mat—



—and what those features looked like as points in the algorithm's map of the surroundings.  The line in blue shows the path Alex walked, which the algorithm derived from his movement in relation to the points. Very, very cool stuff.













Monday, June 8, 2015

2015-05-28: Autonomy Incubator Visits Alexandria PD



The Autonomy Incubator team made a thrilling excursion into the outside world to meet the people and robots that make up the Alexandria Police Department's Special Operations Team Technical Services Unit (TSU) Tactical Computer Section (TCS). Their work is as complicated and impressive as their name would suggest: TCS uses ground robots to handle a wide range of high-pressure scenarios for the Alexandria Police Department, from searches to hostage situations.  Sergeant James Craige and Sergeant Tim Kyburz, both of TCS, demonstrated three of the department's robots, highlighting the machines' current capabilities before outlining their ideas for autonomous elements in future versions.

While it's easy to think of autonomy in terms of delivering packages and collecting data, robots capable of independent decision-making also represent an exciting possibility for the future of public safety: imagine a robot that can not only enter a house to search for hostages, but also recognize a doorknob and use it to open the front door without waiting for explicit, step-by-step instructions from a human.  After our crew observed the TCS robots and the support network required to keep them operational in the field, the two teams agreed to stay in contact to explore collaborative opportunities.

While much was learned and we are excited about future collaboration, this trip was doubly exciting because it marked the Autonomy Incubator's first outing with Lauren and Michael, our new Aero Scholar interns. You can see Lauren and half of Michael in the selfie above. Also pictured are Gil, Loc, Jim, Danette, and Anna.

Friday, June 5, 2015

2015-06-03: Autonomy Incubator UAV Test and Calibration Flights


Our resident UAV pilot, Zak Johns, performed test flights on both the Incubator's Hex Flyer and one of his own professional cinematography rigs to ensure that new hardware on both flyers functioned smoothly. The excitement drew an enthusiastic crowd of interns and staffers alike, so Zak brought his camera gimbal to the other side of the net for a hands-on demonstration.  A gimbal keeps the camera steady and the shots smooth while in the air: no matter what maneuvers the vehicle does, the gimbal adjusts in every direction to keep the camera level. The interns tested this claim, like the engineers they are, by shaking and swiveling the gimbal with great gusto while everyone watched a live video feed from the camera on Zak's iPhone.  Nothing the human testers dished out could make the video so much as jitter.  Once the interns had exhausted themselves, the gimbal was returned to the UAV for its test flight.
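For the curious, the effect the interns were testing reduces to a simple rule: a perfectly stabilized gimbal counter-rotates each axis by exactly the vehicle's rotation. Real brushless gimbals close this loop continuously with IMU feedback; this sketch just shows the idealized result:

```python
def stabilize(vehicle_roll, vehicle_pitch):
    """Idealized gimbal compensation: counter-rotate each axis.

    Takes the vehicle's roll and pitch in degrees and returns the gimbal
    angles that cancel them, leaving the camera level.
    """
    return -vehicle_roll, -vehicle_pitch

# Whatever maneuver the vehicle pulls, the camera attitude comes out level:
for roll, pitch in [(10.0, -5.0), (45.0, 20.0)]:
    g_roll, g_pitch = stabilize(roll, pitch)
    print(roll + g_roll, pitch + g_pitch)  # → 0.0 0.0 each time
```

The hard engineering is in doing this fast and smoothly on a shaking airframe, which is why nothing the interns dished out could make the feed jitter.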

Here's a clip from the camera on top of the Hex Flyer during its flight. Viewers who get motion sick easily may choose to look away; Zak pulls some spins in here that make the roller coasters at Busch Gardens look like merry-go-rounds.