Wednesday, May 3, 2017

2017-05-03: Abbey Hartley Intern Exit Presentation Spring 2017



Oh, you know the drill. It's me, Abbey, talking about all the amazing and varied ways that social media lets me talk about robots on the internet. Featured this time: our brand new and growing Facebook page, the endurance of Twitter, and projections for the future.

2017-05-03: Nick Woodward Intern Exit Presentation Spring 2017



Nick Woodward, who had his first internship with the Autonomy Incubator in 2014, gave his final exit presentation after this year's spring session. Although he's done everything from geofencing to setting up 150 CICADA gliders from the US Naval Research Laboratory over his years in the Ai, his presentation this time focused on the trade studies and computer-aided design (CAD) work he took on in support of multiple Ai missions. His creations included a research plate prototype to mount sensors on our in-house UAVs, a comprehensive analysis of retrieval solutions for UAVs on over-water missions, and new landing gear for the DELIVER project that PI Loc Tran is working on in collaboration with MIT. Nick has been a fixture of the Ai since its inception, and we already miss him dearly, especially the interns who must now take their coffee breaks without his witty banter.

Tuesday, May 2, 2017

2017-04-25: Autonomy Incubator Head Dr. Danette Allen Presents At Uber Elevate Summit

Danette plays a video demonstrating our autonomous mission planning and execution capabilities.

"It's probably no surprise to anyone in this room that NASA's been thinking about highly automated systems for years—since the sixties," Dr. Danette Allen said, addressing a room full of autonomous flight's biggest players.

"So, why do we want to advance beyond that to autonomy?"

The question struck at the heart of the Uber Elevate conference in Dallas this week. Scientists, engineers, tech CEOs and more from all over the world congregated for two days of lectures and panels about autonomous flight and its implications for the future of travel. But why is Uber, known for its sharing-economy approach to car transportation and, more recently, self-driving car research, suddenly so interested in flight?

The Uber Elevate Concept

"Air-taxi[s will] combine electric propulsion, autonomy, vertical lift and many
other communication and navigation capabilities. Fully autonomous air-taxi...
operations, especially in very populated and heavy traffic...areas, I think it’s an
exciting possibility. So when we converge all these capabilities... a lot of new
chapters in aviation are possible... it's a dawn of a new era in aviation."
  
- Dr. Jaiwon Shin, NASA AA for Aeronautics, as quoted in Uber's
"Fast-Forwarding to a Future of On-Demand Urban Air Transportation"


That's right. The future of ridesourcing might just be autonomous, people-carrying UAVs, and the Autonomy Incubator is already a part of the conversation.

Danette was featured as a panelist on the "Urban VTOL Autonomy Decomposition Panel," alongside Near Earth Autonomy CEO Sanjiv Singh, Carnegie Mellon University systems scientist Sebastian Scherer, and GAMA VP of Global Innovation and Policy Greg Bowles. NASA Langley's own Ken Goodrich moderated the session.

Her talk focused on the nature of autonomous decision making, and why the human-machine interface will remain a critical part of fully autonomous systems in the future.

Danette opened by clarifying the difference between automation and autonomy: "The path from automation to autonomy is not a continuum," she said, meaning autonomy cannot be achieved through more and more automation. Dictating how a machine performs a task (automation) and enabling the machine to decide whether and how to perform it (autonomy) are two very different concepts, and must be treated as such.

"It's the difference between autonomy in execution and autonomy in decision making," she said.

A slide from Danette's presentation.


Then, she used a video of NASA JPL's ALHAT (Autonomous Landing and Hazard Avoidance Technology) to segue into autonomous planning and decision making.

ALHAT on "Morpheus Free Flight 12"


"Most systems today can do what we tell them to do, but a lot of that testing happens after the decision is made," she explained. But what about judgement? What about autonomy in decision-making?

Understanding the autonomous decision-making process, like how a self-driving car might choose between swerving into the median and hitting a person, is imperative as we work toward a basis for certifying which systems are "safe," Danette said. To call a system safe, we must be able to assess whether the decisions it makes are "correct," both objectively and subjectively. That, she posited, is why human-machine interaction will always be a critical part of even the most advanced autonomous systems of the future.

"There has to be a human interface in autonomous systems. because ultimately, it's humans who make the certification decision," she said.

Danette concluded by showing a video of the virtual reality capabilities we recently debuted, which enable a novice user to set and adjust flight trajectories in a risk-aware environment. Here's a look at what it can do:




Zak Johns, the Ai's UAV pilot, was also at the conference and attended the Urban VTOL panel. According to Zak, Danette and the Ai became one of the highlights of the afternoon.

"She was definitely the best on her panel," he said.