Author: Tronserve admin
Wednesday 28th July 2021 12:18 PM
To Fly Solo, Racing Drones Have a Need for AI Speed Training
Drone racing’s ultimate vision of quadcopters weaving nimbly through obstacle courses has attracted far less excitement and investment than self-driving cars aimed at reshaping ground transportation. But the U.S. military and defense industry are betting on autonomous drone racing as the next frontier for developing AI that can handle high-speed navigation within tight spaces without human intervention.
The autonomous drone challenge demands split-second decision-making across six degrees of freedom, rather than a car’s mere two degrees of road freedom. One research team developing the AI needed to control autonomous racing drones is the Robotics and Perception Group at the University of Zurich in Switzerland. In late May, the Swiss researchers were among nine teams revealed to be contending in the two-year AlphaPilot open innovation challenge sponsored by U.S. aerospace company Lockheed Martin. The winning team will walk away with up to $2.25 million for defeating other autonomous racing drones and a skilled human drone pilot in head-to-head competitions.
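To make the dimensionality gap concrete, here is a minimal sketch comparing the state a car controller manages with the state a quadcopter controller manages. The class and field names are illustrative only, not drawn from any real autopilot API.

```python
from dataclasses import dataclass

@dataclass
class CarState:
    # A road vehicle is largely constrained to a plane: it can
    # accelerate or brake, and it can steer.
    speed: float     # m/s along the road
    heading: float   # rad, steering direction

@dataclass
class DroneState:
    # A quadcopter moves freely in 3D: three translational and
    # three rotational degrees of freedom, all coupled.
    x: float; y: float; z: float           # position, m
    roll: float; pitch: float; yaw: float  # attitude, rad

def degrees_of_freedom(state) -> int:
    """Count the independent quantities the controller must manage."""
    return len(state.__dataclass_fields__)

print(degrees_of_freedom(CarState(0.0, 0.0)))           # 2
print(degrees_of_freedom(DroneState(0, 0, 0, 0, 0, 0))) # 6
```

Every one of the drone’s six quantities can change abruptly mid-maneuver, which is what makes split-second decisions so much harder than on the road.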
“I think it is crucial to first point out that choosing an autonomous drone to complete a racing track at high speeds or even beating a human pilot does not imply that we can have autonomous drones [capable of] navigating in real-world, complex, unstructured, unknown environments such as disaster zones, collapsed buildings, caves, tunnels or narrow pipes, forests, military scenarios, and so on,” says Davide Scaramuzza, a professor of robotics and perception at the University of Zurich and ETH Zurich. “However, the robust and computationally efficient state estimation, control, and planning algorithms developed for autonomous drone racing would represent a starting point.”
The nine teams that made the cut—from a pool of 424 AlphaPilot applicants—will compete in four 2019 racing events organized under the Drone Racing League’s Artificial Intelligence Robotic Racing Circuit, says Keith Lynn, program manager for AlphaPilot at Lockheed Martin. To guarantee an apples-to-apples comparison of each team’s AI secret sauce, each AlphaPilot team will upload its AI code into identical, specially built drones that have the NVIDIA Xavier GPU at the core of their onboard computing hardware.
“Lockheed Martin is providing mentorship to the nine AlphaPilot teams to support their AI tech development and innovations,” says Lynn. The company “will be hosting a week-long Developers Summit at MIT in July, dedicated to workshopping and improving AlphaPilot teams’ code,” he added. He notes that each team will retain the intellectual property rights to its AI code.
The AlphaPilot challenge takes inspiration from older autonomous drone racing events hosted by academic researchers, Scaramuzza says. He credits Hyungpil Moon, a professor of robotics and mechanical engineering at Sungkyunkwan University in South Korea, for having organized the annual autonomous drone racing competition at the International Conference on Intelligent Robots and Systems since 2016.
It’s no easy task to build and train AI that can perform high-speed flight through complex environments by relying on visual navigation. One major challenge comes from how drones can accelerate sharply, take sharp turns, fly sideways, follow zig-zag patterns, and even perform back flips. That means camera images can suddenly appear tilted or even upside down during drone flight. Motion blur may occur when a drone flies very close to structures at high speed, because camera pixels gather light from multiple directions. Both cameras and vision software can also struggle to compensate for sudden changes between light and dark parts of an environment.
To lend AI a helping hand, Scaramuzza’s group recently released a drone racing dataset that contains realistic training data recorded from a drone flown by a professional pilot in both indoor and outdoor spaces. The data, which includes challenging aerial maneuvers such as back flips, flight sequences that cover hundreds of meters, and flight speeds of up to 83 kilometers per hour, was presented at the 2019 IEEE International Conference on Robotics and Automation.
The drone racing dataset also includes data captured by the group’s special bioinspired event cameras, which can detect changes in motion on a per-pixel basis within microseconds. By comparison, standard cameras need milliseconds (each millisecond being 1,000 microseconds) to compare motion changes across image frames. The event cameras have already proven capable of helping drones nimbly dodge soccer balls thrown at them by the Swiss lab’s researchers.
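The event-camera idea above can be sketched in a few lines. This is a toy illustration using synthetic data, not a real event-camera driver or the Zurich group’s actual pipeline: each event is a tuple (timestamp in microseconds, x, y, polarity), emitted by a single pixel the instant its brightness rises (+1) or falls (-1), rather than waiting for the next full frame.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_events(n, width=64, height=48, duration_us=10_000):
    """Generate n random events over a 10 ms window at microsecond resolution."""
    t = np.sort(rng.integers(0, duration_us, size=n))  # timestamps, us
    x = rng.integers(0, width, size=n)                 # pixel column
    y = rng.integers(0, height, size=n)                # pixel row
    p = rng.choice([-1, 1], size=n)                    # polarity of change
    return np.stack([t, x, y, p], axis=1)

def accumulate(events, width=64, height=48):
    """Collapse a slice of the event stream into a per-pixel change image,
    the kind of representation a downstream vision pipeline could consume."""
    img = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        img[y, x] += p
    return img

events = synthetic_events(5000)
# A frame camera with a ~33 ms exposure would blend all of this motion into
# one blurred frame; the event stream can be sliced at any granularity.
first_ms = events[events[:, 0] < 1_000]
img = accumulate(first_ms)
print(img.shape)  # (48, 64)
```

Because each pixel reports asynchronously, fast maneuvers that would smear a conventional exposure remain resolvable in the event stream.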
The Swiss group’s work on the racing drone dataset received funding in part from the U.S. Defense Advanced Research Projects Agency (DARPA), which acts as the U.S. military’s special R&D arm for more far-out projects. Specifically, the funding came from DARPA’s Fast Lightweight Autonomy program, which envisions tiny autonomous drones capable of flying at high speeds through cluttered environments without GPS guidance or communication with human pilots.
Such speedy drones could serve as military scouts checking out hazardous buildings or alleys. They could also someday help search-and-rescue teams find people trapped in semi-collapsed buildings or lost in the woods. Being able to fly at high speed without crashing into things also makes a drone more efficient at all sorts of tasks by making the most of limited battery life, Scaramuzza says. After all, most of a drone’s battery life gets used up by the need to hover in flight and isn’t drained much by flying faster.
Even if AI manages to conquer the drone racing obstacle courses, that would be only the end of the beginning of the technology’s development. What would remain? Scaramuzza singled out handling low-visibility conditions involving smoke, dust, fog, rain, snow, fire, and hail as among the biggest challenges for vision-based algorithms and AI in complex real-life environments.
“I think we should develop and release datasets containing smoke, dust, fog, rain, fire, etc. if we want to allow autonomous robots to complement human rescuers in saving people’s lives after an earthquake or natural disaster in the future,” Scaramuzza says.
This article is originally posted on IEEESPECTRUM.com