MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has unveiled NanoMap, a new technology for flying drones autonomously through uncertain environments.
The use of ‘uncertain’ is key here. While most autonomous flight systems to date use some iteration of Simultaneous Localisation And Mapping (SLAM) technology, NanoMap takes a fundamentally different approach.
SLAM-based systems rely on intricate 3D maps built from raw, high-fidelity sensor data, but the output of SLAM methods isn’t typically used to plan motions. For that, researchers often turn to methods like “occupancy grids,” in which many measurements are combined into one specific representation of the 3D world.
The data needed to create these grids can be unreliable, is hard to gather quickly enough for UAV applications, and requires serious processing horsepower. UAVs fly at high speeds that can overwhelm computer-vision algorithms, forcing the drone to fall back on its Inertial Measurement Unit (IMU) for navigation: a relatively inexact set of inputs such as the vehicle’s acceleration and rate of rotation.
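To see why IMU-only navigation degrades so quickly, consider a toy one-dimensional dead-reckoning simulation (purely illustrative, not NanoMap’s code): integrating noisy acceleration twice turns small sensor errors into rapidly growing position drift.

```python
import random

def dead_reckon(true_accel, dt=0.01, noise_std=0.05):
    """Integrate noisy accelerometer readings into a position estimate.

    Returns the final error between the dead-reckoned position and the
    true position. Everything is 1-D for simplicity.
    """
    true_v = true_p = est_v = est_p = 0.0
    for a in true_accel:
        # True kinematics
        true_v += a * dt
        true_p += true_v * dt
        # The IMU reports acceleration corrupted by sensor noise;
        # double integration compounds that noise into position drift.
        a_meas = a + random.gauss(0.0, noise_std)
        est_v += a_meas * dt
        est_p += est_v * dt
    return abs(est_p - true_p)

random.seed(0)
# One second of level flight (zero true acceleration), sampled at 100 Hz
short_err = dead_reckon([0.0] * 100)
# Ten seconds: the drift grows much faster than linearly with time
long_err = dead_reckon([0.0] * 1000)
print(short_err, long_err)
```

Because position error from accelerometer noise grows roughly with the three-halves power of time, even a well-calibrated IMU cannot anchor a drone’s position for long: the vehicle needs external references, and any map it compares against must tolerate the resulting drift.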
NanoMap diverges from this approach by treating the drone’s position as uncertain, and actually modelling that uncertainty.
The NanoMap system still relies on 3D data from the drone’s sensors — but instead of rendering these into an ultra-precise map, they are processed as a series of much-easier-to-crunch snapshots.
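A minimal sketch of the snapshot idea (all names here are hypothetical, not drawn from the NanoMap codebase): rather than fusing every depth frame into one global map, each frame is kept verbatim alongside the pose estimate at which it was captured, in a fixed-length history.

```python
from collections import deque
from typing import Deque, List, Tuple

Pose = Tuple[float, float, float]       # (x, y, yaw) at capture time
DepthFrame = List[Tuple[float, float]]  # obstacle points in the sensor frame

class SnapshotHistory:
    """Keep recent depth frames as-is instead of fusing them into a map."""

    def __init__(self, max_frames: int = 50):
        # Old frames simply fall off the end; no global map is maintained.
        self.frames: Deque[Tuple[Pose, DepthFrame]] = deque(maxlen=max_frames)

    def add(self, pose: Pose, depth: DepthFrame) -> None:
        # Each snapshot is stored unmodified, paired with the pose
        # estimate at the moment it was captured.
        self.frames.append((pose, depth))

history = SnapshotHistory(max_frames=3)
for i in range(5):
    history.add((float(i), 0.0, 0.0), [(1.0, 0.0)])
print(len(history.frames))  # only the 3 most recent snapshots survive
```

Skipping the fusion step is what makes each snapshot “easy to crunch”: adding a frame is a constant-time append, and no expensive global optimisation has to run on every sensor update.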
Graduate student Pete Florence, lead author on a new study related to the technology, says that this approach is better suited to applications in which drones must navigate complex and dynamic surroundings.
“Overly confident maps won’t help you if you want drones that can operate at higher speeds in human environments,” he said.
“An approach that is better aware of uncertainty gets us a much higher level of reliability in terms of being able to fly in close quarters and avoid obstacles.”
This new method uses a depth-sensing system to stitch together a series of measurements of the drone’s immediate surroundings. This allows it not only to make motion plans for its current field of view, but also to anticipate how it should move through the hidden fields of view it has already seen.
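The planning step can be sketched as an uncertainty-aware collision check (a simplified 2-D illustration with hypothetical names; the real system handles full 3-D poses and rotations). A candidate waypoint is transformed back into each stored snapshot’s frame, and the drone’s safety radius is inflated by the accumulated pose uncertainty, so older, less certain snapshots demand a wider berth.

```python
import math

def is_safe(plan_point, snapshots, drone_radius=0.3):
    """Check a candidate waypoint against every stored snapshot.

    `snapshots` is a list of (dx, dy, sigma, points): the drone's estimated
    2-D translation since that snapshot was taken, the standard deviation
    of that estimate, and the obstacle points seen in that snapshot's frame.
    """
    for dx, dy, sigma, points in snapshots:
        # Express the waypoint in the old snapshot's frame (translation
        # only, for brevity; the real system also tracks rotation).
        qx, qy = plan_point[0] + dx, plan_point[1] + dy
        # Inflate the safety margin by ~2 standard deviations of pose
        # uncertainty, so drift-prone snapshots force a cautious plan.
        margin = drone_radius + 2.0 * sigma
        for ox, oy in points:
            if math.hypot(qx - ox, qy - oy) < margin:
                return False
    return True

snapshots = [
    (0.0, 0.0, 0.05, [(2.0, 0.0)]),  # fresh snapshot, low uncertainty
    (1.5, 0.0, 0.40, [(2.0, 0.0)]),  # older snapshot, drifted estimate
]
print(is_safe((1.0, 0.0), snapshots))  # False: too close, given the drift
print(is_safe((0.0, 2.0), snapshots))  # True: clear of every snapshot
```

Note how the same waypoint can pass the check against a fresh snapshot yet fail against an older one: the obstacle hasn’t moved, but the drone’s confidence about where it is relative to that obstacle has decayed.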
“It’s kind of like saving all of the images you’ve seen of the world as a big tape in your head,” says Florence. “For the drone to plan motions, it essentially goes back into time to think individually of all the different places that it was in.”
The results of the tests highlight the impact of uncertainty: when the drone drifted just 5 percent from its expected position without modelling that uncertainty, it crashed more than once every four flights; when uncertainty was accounted for, the crash rate dropped to 2 percent.
Sebastian Scherer, a systems scientist at Carnegie Mellon University’s Robotics Institute, said that the team’s work represents a significant departure from previous approaches.
“The key difference to previous work is that the researchers created a map consisting of a set of images with their position uncertainty rather than just a set of images and their positions and orientation. Keeping track of the uncertainty has the advantage of allowing the use of previous images even if the robot doesn’t know exactly where it is, and allows for improved planning,” he said.
“The researchers demonstrated impressive results avoiding obstacles and this work enables robots to quickly check for collisions. Fast flight among obstacles is a key capability that will allow better filming of action sequences, more efficient information gathering and other advances in the future.”