Wednesday, May 3, 2017
2017-05-03: Abbey Hartley Intern Exit Presentation Spring 2017
Oh, you know the drill. It's me, Abbey, talking about all the amazing and varied ways that social media lets me talk about robots on the internet. Featured this time: our brand new and growing Facebook page, the endurance of Twitter, and projections for the future.
2017-05-03: Nick Woodward Intern Exit Presentation Spring 2017
Nick Woodward, who had his first internship with the Autonomy Incubator in 2014, gave his final exit presentation after this year's spring session. Although he's done everything from geofencing to setting up 150 CICADA gliders from the US Naval Research Laboratory over his years in the Ai, his presentation this time focused on the trade studies and computer-aided design (CAD) work he took on in support of multiple Ai missions. His creations included a research plate prototype to mount sensors on our in-house UAVs, a comprehensive analysis of retrieval solutions for UAVs on over-water missions, and new landing gear for the DELIVER project that PI Loc Tran is working on in collaboration with MIT. Nick has been a fixture of the Ai since its inception, and we already miss him dearly– especially the interns who must now take their coffee breaks without his witty banter.
Tuesday, May 2, 2017
2017-04-25: Autonomy Incubator Head Dr. Danette Allen Presents At Uber Elevate Summit
[Image: Danette plays a video demonstrating our autonomous mission planning and execution capabilities.]
"So, why do we want to advance beyond that to autonomy?"
The question struck at the heart of the Uber Elevate conference in Dallas this week. Scientists, engineers, tech CEOs and more from all over the world congregated for two days of lectures and panels about autonomous flight and its implications for the future of travel. But, why is Uber, known for its sharing economy approach to car transportation and, more recently, self-driving car research, suddenly so interested in flight?
The answer: the future of ridesourcing might just be autonomous, people-carrying UAVs, and the Autonomy Incubator is already a part of the conversation.
Danette was featured as a panelist on the "Urban VTOL Autonomy Decomposition Panel," alongside Near Earth Autonomy CEO Sanjiv Singh, Carnegie Mellon University systems scientist Sebastian Scherer, and GAMA VP of Global Innovation and Policy Greg Bowles. NASA Langley's own Ken Goodrich moderated the session.
Her talk focused on the nature of autonomous decision making, and why the human-machine interface will remain a critical part of fully autonomous systems in the future.
Danette opened by clarifying the difference between automation and autonomy: "The path from automation to autonomy is not a continuum," she said, meaning autonomy cannot be achieved through more and more automation. Dictating how a machine performs a task (automation) and enabling the machine to decide whether to and how to do the task (autonomy) are two very different concepts, and must be treated as such.
"It's the difference between autonomy in execution and autonomy in decision making," she said.
[Image: A slide from Danette's presentation.]
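Danette's automation-versus-autonomy distinction can be made concrete with a toy sketch. This is purely illustrative (none of these function names come from any NASA system): the "automated" routine executes a fixed plan exactly as commanded, while the "autonomous" routine decides whether and how to pursue its goal.

```python
# Toy contrast between automation and autonomy. All names and logic
# here are illustrative assumptions, not any real flight software.

def automated_flight(waypoints):
    """Automation: execute a fixed plan exactly as commanded."""
    return [f"fly to {wp}" for wp in waypoints]

def autonomous_flight(goal, hazards):
    """Autonomy: decide whether to fly at all, and how."""
    if goal in hazards:
        return ["abort: goal area unsafe"]
    # Choose a route that skips known hazards instead of blindly
    # following a pre-scripted waypoint list.
    route = [wp for wp in ["A", "B", goal] if wp not in hazards]
    return [f"fly to {wp}" for wp in route]

print(automated_flight(["A", "B", "C"]))      # follows the plan as given
print(autonomous_flight("C", hazards={"B"}))  # replans around hazard B
```

The automated version will "fly to B" even if B is on fire; the autonomous version owns the decision. That gap, not more scripting, is what the panel called autonomy in decision making.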
Then, she used a video of NASA JPL's ALHAT (Autonomous Landing and Hazard Avoidance Technology) to segue into autonomous planning and decision making.
[Video: ALHAT on "Morpheus Free Flight 12"]
"Most systems today can do what we tell them to do, but a lot of that testing happens after the decision is made," she explained. But what about judgement? What about autonomy in decision-making?
Understanding the autonomous decision-making process, like how a self-driving car might choose between swerving into the median and hitting a pedestrian, is imperative as we define a basis for certifying systems as "safe," Danette said. To call a system safe, we must be able to assess whether the decisions it makes are "correct," both objectively and subjectively. That, she posited, is why human-machine interaction will always be a critical part of even the most advanced autonomous systems of the future.
"There has to be a human interface in autonomous systems, because ultimately, it's humans who make the certification decision," she said.
Danette concluded by showing a video of the virtual reality capabilities we recently debuted, which enable a novice user to set and adjust flight trajectories in a risk-aware environment. Here's a look at what it can do:
Zak Johns, the Ai's UAV pilot, was also at the conference and attended the Urban VTOL panel. According to Zak, Danette and the Ai became one of the highlights of the afternoon.
"She was definitely the best on her panel," he said.
Tuesday, April 11, 2017
2017-04-11: Autonomy Incubator Implements Novel Tethered Flight Solution
[Image: The tethered Hive UAV in flight]
Tethers. They let us conduct quick outdoor flight tests without the paperwork involved with untethered UAV flight, but logistically, they can be complicated— string and propellers do NOT mix, so keeping the tether under control is imperative. For smaller vehicles like the ones we flew during our outdoor package delivery mission last summer, our solution was to tether the vehicle to a fly fishing rod and follow it around:
Pretty clever, right? However, now we're working with The Hive, a vehicle with twice the weight and over double the thrust of the quadrotor we used last year. The fishing line is too thin to hold this machine, so we have to use nylon cord. And without the finesse and precision of a fishing reel to control the line, the cord is in danger of getting blown up into the propeller and damaging the vehicle.
Enter Zak Johns, the Ai's expert drone pilot and master problem-solver, with a solution.
[Image: Zak Johns inspects the Hive UAV, ozone sensor payload, and tether]
By feeding the tether through a length of flexible tubing attached to the body of the UAV, Zak ensures that the cord always stays well clear of the props— even during takeoff and landing, when the air under the vehicle is the most turbulent.
[Image: The winds today were gusting at 35 mph, and it still performed perfectly.]
Now that we have a simple, easy way of managing the tether for large UAVs, our outdoor flight abilities just became way more agile– just in time for our OWLETS atmospheric science mission coming up next summer!
Friday, March 31, 2017
2017-03-29: Autonomy Incubator Hosts Chief Technology Council and Commander of the 1st Fighter Wing
[Image: Dr. Loc Tran demonstrates object classification on the crowd.]
This was a big week of demonstrations at the Ai, with two groups of brilliant minds coming through to learn about our autonomy research. Tuesday brought a crowd of officials from NASA's Chief Technology Council (CTC) and Chief Technology Office (CTO), while on Wednesday, we demonstrated for Air Force Colonel Pete Fesler, Commander of the 1st Fighter Wing (ACC) at Joint Base Langley-Eustis.
[Image: Ai head Dr. Danette Allen addresses the CTO delegation.]
[Image: Danette explains our motion capture system to Col. Fesler.]
DWD is always a crowd-pleaser. After Ben finished demonstrating the real-time path replanning algorithm, Col. Fesler exclaimed, "I didn't know we could do that!"
Next, Dr. Loc Tran gave an overview of our computer vision work. He combined the 3DEEGAN convolutional neural network and the work he's doing with MIT into one demonstration of object recognition and classification.
Finally, we showed our guests a scaled-down section of our payload delivery mission from last summer: one of our large quadrotors took off, autonomously navigated to the ozone sensor site, and then used a downward-facing camera with a natural feature tracker algorithm to locate and collect the sensor. Here's the video of the whole mission if you want to see one of our moments of glory.
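The sensor-retrieval step above relied on locating a known target in the downward-facing camera feed. As a rough sketch of what a natural-feature or template tracker does (the Ai's actual algorithm is more sophisticated; this toy version and its data are invented for illustration), here is a brute-force template match over a tiny grayscale "image" using sum of squared differences:

```python
# Minimal template-matching sketch: slide a small template over a
# grayscale frame and return the (x, y) of the best-matching window.
# Frame and template values are made-up pixel intensities.

def find_target(frame, template):
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            # Sum of squared differences: 0 means a perfect match.
            ssd = sum(
                (frame[y + j][x + i] - template[j][i]) ** 2
                for j in range(th) for i in range(tw)
            )
            if best_score is None or ssd < best_score:
                best_score, best_pos = ssd, (x, y)
    return best_pos

frame = [
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],
    [0, 7, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(find_target(frame, template))  # (1, 1): top-left of the match
```

In a real retrieval mission the matched pixel location would then be converted through the camera model into a position offset for the vehicle to fly to.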
We were thrilled and honored to share our mission with so many visitors this week. Congrats to the whole team on a flawless pair of demos!
Monday, March 27, 2017
2017-03-27: Maiden Voyage of the First OWLETS Vehicle
[Image: Zak Johns assembles the Hive vehicles in front of the Lunar Lander Research Facility.]
[Image: Mounting the ozone sensor to the top of the Hive]
Because the sensor package is uniquely shaped compared to our previous research packages (it's tall, and mounted on the top of the vehicle instead of the bottom), we ran these test flights as very short missions to observe the vehicle in-flight. If the sensor affects the flight behavior of the vehicle at all, it's important to find out now so we can mitigate it right away. Longer tests are scheduled for next week, but so far, the vehicle flies beautifully.
[Image: Look at that stability!]
[Image: NASA scientist Guillaume Gronoff examines the ozone data.]
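One simple way to quantify "flies beautifully" on a short shakeout flight like this is to compute the RMS deviation of logged hover positions from the commanded hover point. This sketch is a hypothetical check, not the Ai's actual analysis; the telemetry samples and threshold are invented for illustration:

```python
# Hypothetical hover-stability check: RMS 3D distance between logged
# positions (e.g. from motion capture) and the commanded hover point.
import math

def rms_deviation(samples, target):
    """Root-mean-square 3D distance from the commanded hover point, in m."""
    sq_dists = [sum((s[i] - target[i]) ** 2 for i in range(3)) for s in samples]
    return math.sqrt(sum(sq_dists) / len(sq_dists))

# Made-up position log: (x, y, z) in meters while holding at (0, 0, 1).
hover_log = [(0.02, -0.01, 1.00), (0.01, 0.00, 1.02), (-0.02, 0.01, 0.99)]
dev = rms_deviation(hover_log, target=(0.0, 0.0, 1.0))
print(f"RMS hover deviation: {dev:.3f} m")  # small value = stable hover
```

Running the same metric with and without the top-mounted sensor would show directly whether the payload is degrading flight behavior.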
Friday, March 24, 2017
2017-03-23: See How We 3D Print Our Way To Better UAVs
Ben Kelley, a small GPS chip in his hand, approached Nick Woodward's desk.
"This is going to go on a vehicle, but I can't find a case for it anywhere online," he explained, handing over the chip.
"I can do that," Nick said.
Nick, an intern and recent grad of Worcester Polytechnic Institute, has many jobs around the Ai, but chief among them is what he calls "the CAD guy." Thanks to what he's accomplished in his three-year stint in the Ai, CAD (computer-aided design) and 3D printing have become a part of everyday operations– which means faster prototyping for our whole lab. Where before people would have had to go outside the Ai to get custom components designed and fabricated, Nick can whip up a quick plastic prototype in less than half a day.
"It provides us with a relatively new and unique advantage of being able to rapidly prototype multiple options for a given solution," he added. So, if he has more than one idea about how to approach a problem, he can just print up both and see which one works best.
On occasion, Nick sends a successful prototype out to the NASA Langley printing lab because "their abilities for production are way greater than ours... you're looking at a machine the size of a queen bed and about 8 ft tall." Today though, he planned to put his creation straight from our printer onto the vehicle.
It all starts with taking measurements. Nick uses calipers to take down the exact dimensions of the chip, noting outlets and screw holes, while he thinks about how to create the best case for it. In less than an hour, he's got a completed first draft.
"If there are multiple parts," he said, "I make sure everything fits together as intended in simulation and then print it out."
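The core arithmetic behind a case like this is simple: the inner cavity is the measured part plus a clearance gap, and the outer shell adds wall thickness on top of that. Here is a minimal sketch of that dimensioning; the chip size, clearance, and wall values are invented for illustration, not Nick's actual measurements:

```python
# Hypothetical case-dimensioning sketch for a simple box enclosure.
# All numbers are illustrative assumptions.

def case_dimensions(part_mm, clearance=0.4, wall=2.0):
    """Return (inner, outer) box dimensions in mm for a measured part.

    clearance: gap on each side so the part drops in without force.
    wall: printed wall thickness on each face.
    """
    inner = tuple(d + 2 * clearance for d in part_mm)
    outer = tuple(d + 2 * wall for d in inner)
    return inner, outer

chip = (38.0, 38.0, 8.5)  # made-up GPS chip footprint and height, mm
inner, outer = case_dimensions(chip)
print("inner cavity (mm):", inner)
print("outer case (mm):", outer)
```

Getting the clearance value right is exactly the kind of thing that takes an iteration or two; a cavity a couple of millimeters off is a quick parameter tweak and a reprint.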
Our tabletop printers are extruders, which means they essentially function like very precise hot glue guns– they feed a spool of solid PLA plastic into a heated tip that applies the melted plastic to the work surface in thin layers. The machines are very fast, but also very delicate. Before Nick can fire up the print, he has to make completely sure that the machine is in full working order. This includes cleaning the extruder: cranking the temperature up as high as it goes, letting it cool, and then extracting the purged excess plastic with a pair of pliers.
"What are you doing now?" I asked as he ducked his head practically inside the machine.
"Calibrating," he said. If the extruder head isn't exactly the right distance away from the work surface, or if the axes aren't tuned just right, the print will come out off-kilter. Same for if there's any dirt at all on the surface– before starting the print, Nick has to buff it several times with a paper towel.
"If there's anything between the plate and the PLA when it goes down, it can pull away from the plate mid-print and ruin the print," he explained. There are other ways of making sure your print sticks to the plate– you can rub the printing surface with a glue stick, or coat it with a layer of painter's tape– but Nick prefers to just keep the environment as clean as possible.
Once everything is ready to go, Nick starts the print and walks away. There's nothing to do now but wait.
"Ideally, you wanna make a print that you can put on in the morning and pick up before close of day, from a prototyping standpoint," he said. The time a component takes to print is dependent on not just the size of the piece, but also the "infill"— basically, how dense the inner honeycomb structure of the print is. Lighter infill pieces print faster, but won't be as hardy. It's a balance Nick has to consider whenever he makes a new prototype.
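That size-versus-infill tradeoff can be roughed out numerically: print time scales with the plastic actually deposited, which is the solid shell plus the interior volume scaled by the infill fraction. The rate constant and shell fraction below are invented for illustration, not calibrated to our printers:

```python
# Back-of-the-envelope print-time estimator. shell_fraction and
# cm3_per_hour are made-up illustrative constants.

def estimate_print_hours(volume_cm3, infill, shell_fraction=0.15,
                         cm3_per_hour=10.0):
    """Estimate hours to print a part of given volume at an infill fraction."""
    shell = volume_cm3 * shell_fraction        # solid outer walls
    interior = volume_cm3 - shell              # honeycombed interior
    plastic = shell + interior * infill        # plastic actually extruded
    return plastic / cm3_per_hour

for infill in (0.1, 0.2, 0.5):
    hours = estimate_print_hours(volume_cm3=120.0, infill=infill)
    print(f"{int(infill * 100)}% infill: ~{hours:.1f} h")
```

Even this crude model shows why dialing infill down is the first lever for hitting a "start in the morning, done by close of day" print.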
"I had a twenty-six hour print once," he added. It was the cone and tetrahedron package delivery system he created last summer.
Once the print is finished, Nick gently pries the pieces off the printer bed and hands them over to Ben. Here he is, assembling the case around the GPS chip.
This first draft turned out pretty well, but Nick said he's doing revisions for a second iteration.
"The inside is about two millimeters too shallow," he said, "and I need to research closing mechanisms that don't require screws." Luckily, a new and improved component is only a few minutes of CAD work and an hour of printing away.