Tuesday, February 21, 2017

2017-02-09: Autonomy Incubator Collaborates with MIT on Collaborative Search-And-Rescue Vehicles

Loc hand-flies the UAV through the search area.

As part of the collaboration between NASA Langley's DELIVER initiative and MIT's SRTC (Search and Rescue under The Canopy) project, Ai engineer Loc Tran has spent the last two days tromping through the woods in the back half of the center, a UAV held at arm's length in front of him.

"We're recording data," he said before handing me a laptop. He took the UAV on long, looping paths through the forest, while I followed behind him and watched streams of data flow in from the onboard sensors: GPS positioning information, measurements from the lidar on top of the UAV, and video from the front-mounted camera. Everything the UAV would need to fly autonomously was already on board; I was just there to verify that it was all working. Houston to Apollo 11, if you will.

My view from Mission Control.

"So, it's making maps right now?" I asked, as we passed the same bench for the third time.

"Yeah, but what's important is that we're going through the same area in different directions to match up the maps we get," Loc explained.

UAVs have had the ability to navigate and create maps for years now; look at all the PTAM and computer vision research we've already done at the Ai. This UAV is unique in that as it navigates and creates maps, it shares those maps with other vehicles navigating the same area to collaboratively create one big master map. By looping around and crossing through the search area multiple times, Loc can test the map-matching algorithm to verify that it recognizes the same topography from different angles.
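The idea behind the master map can be sketched in a few lines. This is a toy illustration with hypothetical names, not the project's actual code: each vehicle keeps a 2-D occupancy grid (the real system builds lidar-based maps, and aligning them is exactly the map-matching problem Loc is testing — here the grids are assumed to be pre-aligned in a shared frame), and the team merges them into one picture of the search area.

```python
import numpy as np

def merge_maps(grid_a, grid_b):
    """Merge two pre-aligned occupancy grids into one master map.

    Cell values: -1 = unexplored, 0 = observed free, 1 = observed occupied.
    Any observation overrides 'unexplored', and an 'occupied' reading
    wins over 'free' so the merged map stays conservative.
    """
    # max() implements both rules at once: 1 > 0 > -1.
    return np.maximum(grid_a, grid_b)

# Two UAVs map overlapping parts of the same 4x4 search area.
a = np.full((4, 4), -1)
a[0:2, :] = 0          # UAV A explored the top half and found it free...
a[1, 2] = 1            # ...except for one obstacle (a tree, say)

b = np.full((4, 4), -1)
b[2:4, :] = 0          # UAV B explored the bottom half

master = merge_maps(a, b)
print(master)          # no -1 cells remain: together they covered everything
```

The real difficulty, of course, is the alignment step this sketch assumes away: recognizing that two maps built from different directions describe the same terrain.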

"We want to be able to know where one drone has searched versus where another drone has searched," he said. In a search-and-rescue situation, time is a precious commodity. If the team of search vehicles can collaborate in their analysis instead of individually scanning for the same subject, the time saved could be crucial to a successful rescue.
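That bookkeeping — who has searched where — can be sketched with simple sets of grid cells (the names and the cell-based representation are illustrative assumptions, not taken from the project):

```python
# Hypothetical sketch: each vehicle reports the grid cells it has searched,
# and the team subtracts the union from the full search area to see what's left.
searched = {
    "uav_a": {(0, 0), (0, 1), (1, 1)},
    "uav_b": {(1, 1), (2, 1), (2, 2)},   # note the overlap at (1, 1)
}

team_covered = set().union(*searched.values())
search_area = {(r, c) for r in range(3) for c in range(3)}
remaining = search_area - team_covered

print(sorted(remaining))  # → [(0, 2), (1, 0), (1, 2), (2, 0)]
```

The overlap at (1, 1) is wasted effort — exactly what sharing the master map is meant to minimize.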

The modern, less cuddly version of a Swiss Saint Bernard.
Once we had made enough laps around the search area, Loc took the equipment back to the Ai and prepared the data to send off to MIT.

"This is their design; we're helping them with the operations aspect of it– testing the thing, collecting data and trying [the algorithm] on our own data set," he said. "They don't have woods where they are." Critical in an under-the-canopy search project.

Before the day was over, Loc also switched places with me so I could take the UAV for a joyride. Hand-flying is harder than it looks: the lidar has a 270° field of view, so you have to stay right behind the vehicle to keep out of its way, plus it's heavy. But the information I collected will become part of the project data set. You're welcome, America!

Me, proving that English majors really can do anything.
