Wednesday, November 27, 2019

2019-11-27: Inertial Transfer Research Uses an End Effector to Catch and Grasp


Intern Justin Goddeau has been working on inertial transfer as part of the In Space Assembly research at the Autonomy Incubator.

For this demonstration, an end effector was added to the arm so that it can now actually grasp the puck on the air bearing table.

Watch this short video to see it in action!

Tuesday, November 26, 2019

2019-11-26: MIT Students Visit the Autonomy Incubator for Continued Search and Rescue Research

Katherine, Kyel, and Yulun analyze the programming during the test.

In October, three MIT students visited the Autonomy Incubator again to continue their research with search and rescue missions.

The three students, Katherine Lie, Kyel Ok, and Yulun Tian, have visited several times in the past. In fact, you can read about one of their visits here!

Fortunately for us, the Virginia weather wasn't too cold yet, and we could bear the outdoors without the risk of freezing.

The specific goal this time around was to test an integrated, real-time exchange of information between the UAVs. The team added two new modifications to how we normally run our search and rescue tests. The first was the person detection itself, and Loc Tran actually printed a life-sized portrait of his son to act as the lost hiker.

Loc's cardboard son was our stand-in missing person.

As hysterical as that is, it was actually very successful!

The person detection was running on-board, and they were able to get some great detections of the person and even the co-pilot.

The drone successfully detected the fake Felix.

This was their first time testing with the on-board person detection, rather than off-board, so the drone was actually flying around with a camera and doing the search.

"It was a nice way to tie everything together because now it's doing the actual detecting," Loc said. "It puts a stamp on the search part of the search and rescue mission."

The second modification was "a bit more behind the scenes," as Loc said. Since they were testing on-board map merging instead of off-board, each of the two vehicles was creating its own map and then aligning it with the other's to make one unified map. Sharing information this way lets the two drones keep track of where they've already searched and where each vehicle is in space at any given moment. They're no longer independent of each other but integrated.
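The post doesn't describe the alignment algorithm the MIT team used, but the core idea, expressing one vehicle's map in the other's coordinate frame, can be sketched with a textbook rigid-registration step. Below is a minimal NumPy example using the Kabsch/Procrustes solution, assuming the two maps share corresponding points; a real system would establish those correspondences itself, for example with scan matching or ICP.

```python
import numpy as np

def align_2d(src, dst):
    """Rigid 2-D alignment: find R, t such that dst ~ R @ src + t.

    src, dst: (N, 2) arrays of corresponding map points (e.g.,
    landmarks both vehicles observed). Kabsch / Procrustes solution.
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - cs, dst - cd
    U, _, Vt = np.linalg.svd(S.T @ D)      # 2x2 cross-covariance
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

# Toy demo: vehicle B's map is vehicle A's map, rotated and shifted.
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
map_a = np.random.rand(100, 2) * 20.0               # stand-in 2-D map points
map_b = (map_a - np.array([5.0, 2.0])) @ R_true.T   # B's view of the same area

R, t = align_2d(map_b, map_a)                       # express B in A's frame
merged = np.vstack([map_a, (R @ map_b.T).T + t])    # one unified map
```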

Before, the LIDAR scans were used to build a full 3D map. Now they compute a 2D approximation of each map and use that to align the two.
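As a hedged sketch of that flattening step: a 3D point cloud can be collapsed into a 2D occupancy grid by discarding points outside a height band and binning the rest into cells. The resolution and height limits below are illustrative guesses, not the team's actual parameters.

```python
import numpy as np

def flatten_to_grid(points_3d, resolution=0.1, z_min=0.2, z_max=2.0):
    """Project a 3-D point cloud (N, 3) into a 2-D occupancy grid.

    Points outside [z_min, z_max] (e.g., floor and ceiling returns)
    are discarded; the rest are binned into resolution-sized cells.
    """
    # Keep only points at "obstacle" heights.
    pts = points_3d[(points_3d[:, 2] >= z_min) & (points_3d[:, 2] <= z_max)]

    # Convert x/y coordinates to integer grid indices.
    ij = np.floor(pts[:, :2] / resolution).astype(int)
    ij -= ij.min(axis=0)                   # shift so indices start at 0

    grid = np.zeros(ij.max(axis=0) + 1, dtype=bool)
    grid[ij[:, 0], ij[:, 1]] = True        # mark occupied cells
    return grid

cloud = np.random.rand(1000, 3) * [10.0, 10.0, 3.0]  # stand-in LIDAR points
grid = flatten_to_grid(cloud)
```

Aligning two such 2D grids is much cheaper than aligning full 3D maps, which is what makes the on-board, real-time merging practical.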

Other than the MIT students and Loc, a few others had a huge hand in the effort. Chester Dolph has helped out a lot over the few years they have been working on the project.

"It was a smashing success!" Chester said. "It was a really awesome project, and I like that it solved an interesting problem space. Getting all of these sensors to work together in real-time is very tricky, but the MIT folks developed a great code."

Pilots Brian Duvall and Zak Johns are also to thank, as well as Ralph Williams and Chris Meak.

Naturally, the group came together to take a photo at the end of the research day.

This mission was one of five winners of the AUVSI XCELLENCE Humanitarian Award in May.