A drone from the University of Zurich is an engineering marvel. It also moves slower than someone taking a Sunday morning jog.
At the International Conference on Intelligent Robots and Systems in Madrid last October, the autonomous drone, which navigates using artificial intelligence, raced through a complicated series of turns and gates, buzzing and moving like a determined and oversized bumblebee. It bobbed to duck under a bar that swooshed like a clock hand, yawed left, pitched forward and raced toward the finish line. The drone, small and covered in sensors, demolished the competition, blazing through the course twice as fast as its nearest competitor. Its top speed: 5.6 miles per hour.
A few weeks earlier, in Jeddah, Saudi Arabia, a different drone, flown remotely by its pilot, Paul Nurkkala, shot through a gate at the top of a 131-foot-high tower, inverted into a roll and then dove toward the earth. Competitors trailed behind or crashed to pieces along the course, but this one swerved and corkscrewed through twin arches, hit a straightaway and then blasted into the netting that served as the finish line for the Drone Racing League’s world championship. The winning drone, a league-standard Racer3, reached speeds over 90 miles per hour, but it needed a human to guide it. Mr. Nurkkala, known to fans as Nurk, wore a pair of goggles that beamed him a first-person view of his drone as he flew it.
Artificial intelligence, or A.I., has been on a hot streak, besting humans in competitions over the past five years. AlphaGo, a program built by DeepMind, the artificial intelligence arm of Google parent Alphabet, went from learning the basics of the game Go to beating the world’s best human player in a little over three years. More recently, the A.I. AlphaStar, also by DeepMind, was able to beat a top player in the complex strategy video game “StarCraft II,” shutting out its human competitor five games to zero.
But the real world can be an immensely noisy place, and many A.I.-powered autonomous vehicles still struggle to excel in it.
In 2017, NASA’s Jet Propulsion Laboratory built three autonomous drones and pitted them in a race against Ken Loo, an expert drone pilot. He easily beat them all. Darpa’s Fast Lightweight Autonomy program has been able to send drones through tight hallways at 45 miles per hour, faster than the drone from the University of Zurich, though in a far less complex setting, and still slower than a human pilot. A piloted drone swooping and arcing around a course while an autonomous drone hesitantly chugs through space is the difference between an N.B.A. point guard driving toward the basket and a toddler learning to walk.
Onboard computers will get more powerful. Algorithms for computing optimal flight paths will improve. New image processing techniques will shrink the time it takes a computer to understand what it sees from milliseconds to microseconds, while the human eye will always need at least 13 milliseconds to process visual stimuli. But that’s in the future. Right now, an autonomous drone completing a racecourse at a speed faster than 5.6 miles per hour will be an accomplishment.
This year, a new competition will try to make sure autonomous drones are more nimble — and that they are truly able to act by themselves.
“Right now, autonomous drones are a thing you’d only find in labs, being pioneered by a small, niche audience,” said Keith Lynn, Lockheed Martin’s program manager for AlphaPilot, an autonomous drone racing competition organized by the Drone Racing League.
The AlphaPilot competition, which is sponsored by Lockheed Martin and part of the racing league’s new Artificial Intelligence Robotic Racing Circuit, aims to drive interest and research into self-driving, or autonomous, vehicles. Nine teams will compete this fall, winnowed from the 430 — students, A.I. researchers and independent drone enthusiasts, among others — currently making their way through qualifying rounds, according to the organizers. The winning team will take home a prize of $1 million. If the A.I. drone can also beat a top human pilot in a head-to-head race, the team will get an extra $250,000.
Competitors, the league said, will program a Racer3 that includes an artificial intelligence chip made by Nvidia, a partner in the competition. The nine qualifying teams will be announced this spring, according to the organizers.
For autonomous drones to be useful — in disaster zones, as delivery vehicles or in rural areas — they will need to be able to fly far, fast and without human oversight, often in environments where they can’t rely on external guidance systems, like GPS. And that’s one of their biggest challenges.
“Current autonomous drones have very little onboard decision-making,” said Kerry Snyder, a founder of KEF Robotics, a competing team from Pittsburgh. “They will almost always be following very specific human commands and rarely be able to accomplish higher level tasks such as ‘Find a trapped person’ or ‘Fly through an open window and then explore.’”
There’s also a gap between code created in the lab and real-world flying. “A lot of our A.I. is primarily developed in simulation,” said Dr. Chelsea Sabo, technical lead of the AlphaPilot program at Lockheed Martin. “Going from simulation to the real world is going to be a big challenge in AlphaPilot.”
Machine learning allows for A.I. to train much faster than flesh-and-blood pilots. A human who wants to practice for 10,000 hours needs three-and-a-half years of eight-hour days. A computer using machine learning can fly 10,000 simulated hours overnight.
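The back-of-the-envelope arithmetic behind that comparison is easy to check. This sketch uses the article's figures (10,000 hours, eight-hour days); the simulator speed is an assumed number for illustration, not one from the league:

```python
PRACTICE_HOURS = 10_000

# Human: eight hours of practice per day, every single day.
human_days = PRACTICE_HOURS / 8    # 1,250 days
human_years = human_days / 365     # roughly 3.4 years

# Simulator: assume each simulated flight hour takes one second of
# wall-clock compute (a hypothetical speedup, not a figure from the
# article). Then 10,000 hours of "practice" fits easily into one night.
SIM_SECONDS_PER_HOUR = 1
sim_wall_clock_hours = PRACTICE_HOURS * SIM_SECONDS_PER_HOUR / 3600

print(round(human_years, 1))         # 3.4 (years)
print(round(sim_wall_clock_hours, 1))  # 2.8 (hours)
```

Under even a much slower simulator, the gap between years and hours is the point of the comparison.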
Eric Amoroso, another co-founder of KEF Robotics, said that autonomous vehicles can also be more precise than human pilots. Autonomous drones can fly more precisely by making subtle alterations to how much thrust each propeller produces, for example, he said, and can use information from sensors that humans don’t have, like an accelerometer, to estimate where they are in space.
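The basic idea behind using an accelerometer that way is dead reckoning: integrating acceleration once to get velocity and again to get position. This is a toy one-dimensional sketch of that principle, not the league's or KEF Robotics' actual flight code; real flight controllers fuse it with cameras and gyroscopes because integration drift accumulates quickly:

```python
def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Estimate position by integrating accelerometer samples twice.

    accels: acceleration readings (m/s^2) taken at a fixed timestep dt (s).
    Returns the estimated position after each sample.
    """
    v, x = v0, x0
    positions = []
    for a in accels:
        v += a * dt   # integrate acceleration into velocity
        x += v * dt   # integrate velocity into position
        positions.append(x)
    return positions

# Constant 2 m/s^2 of thrust for one second, sampled at 100 Hz:
track = dead_reckon([2.0] * 100, dt=0.01)
print(round(track[-1], 2))  # 1.01 m, close to the ideal 0.5*a*t^2 = 1.0 m
```

The small error after just one second of simple Euler integration hints at why drift is a problem over a whole race, and why vision is still needed to correct the estimate.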
But where humans excel — and A.I. in general falls short — is in merging those individual skills into a cohesive whole, and doing it as fast as humans can. “Sensing the world, making decisions, acting on it, and doing that in real-time, that’s really the fundamental challenge of robotics,” Mr. Amoroso said.
Autonomous drones also struggle to make sense of visual information, particularly at high speeds, in part because of shortcomings in sensors. “The main challenge of autonomous drone flight is perception based on cameras,” said Davide Scaramuzza, professor of robotics and perception at the University of Zurich and the creator of the autonomous drone that won the competition in Madrid last year. “The faster the drone goes, the more blurred the image gets.”
A drone flying in the AlphaPilot competition will be able to fly based only on what it can see in front of it, and must use that information to know where it is in physical space. At 90 miles per hour, even at the speeds computers can calculate, an autonomous drone won’t be able to process images as fast as a human can, and may be thrown off course by something as simple as a shadow, leading it to miss a gate or believe it’s one foot to the right of where it actually is. At best, this means the autonomous drone will need to course correct, slowing it down. At worst, it crashes.
There’s also the issue of strategy. “A pilot with thousands of hours of flight time isn’t just thinking about the gate in front of them,” Mr. Nurkkala, the D.R.L. champion, said. “They’re thinking about the next five or six gates, and how to position themselves to keep the best racing line.” Inside a simulator, an AlphaPilot drone might be able to make those same sorts of tactical and strategic plans. On a real-world course moving at speed, it may just be doing its best to hit each gate.
And going fast is a calculation of risk versus reward; if you’re flying without ever crashing, Mr. Nurkkala said, you’re not flying as fast as possible. For teams programming AlphaPilot drones, this will mean training those drones to make trade-offs of when to speed up and when to play it safe — decisions humans often make instinctually. “In our simulations, we can tune our system so that it finishes the course 100 percent of the time, but it flies pretty slow,” Mr. Snyder said. “Or we can tune it to finish the course 20 percent of the time, but it flies much faster.” The winner of the AlphaPilot competition — and of its $1 million grand prize — will likely be the team that not only optimizes its algorithms, but also takes the smartest risks.
And if an autonomous drone can outrace a human-controlled one? “In my opinion, it would be very similar to Deep Blue or AlphaGo,” Mr. Snyder said, referring to the chess-playing computer and the DeepMind A.I. “It represents a major step in technical progress and innovation, even if it comes in a limited environment, one where an autonomous robot could realistically outperform a human. That’s in contrast to the infinite and fractal complexity of self-driving cars.”
Mr. Lynn is more effusive: “Everyone will know the programmers behind that team. They’ll be celebrities. Companies will want to endorse them. Kids will aspire to be them. That’s exciting from a STEM perspective.”
And Mr. Lynn believes professional pilots will also take notice. “I think you’ll also see the human-piloted D.R.L. races change, as pilots learn from the efficiencies that only a computer and code can hone in on, and approach maneuvers in entirely new ways. That’s exciting for the future of human and machine teamwork.”