WASHINGTON ― Human fighter pilots, your jobs are safe for now.
Weeks after an artificial intelligence algorithm defeated a human pilot in a simulated dogfight between F-16 jets, the Pentagon’s director of research and engineering for modernization said Thursday at the Defense News Conference that it’s more likely an AI will team with military pilots than replace them.
“I don’t see human pilots being phased out, I see them being enhanced, not physically, but I see their work, their effectiveness being enhanced by cooperation with artificial intelligence systems,” said Mark Lewis, who also serves as the acting deputy undersecretary of defense for research and engineering.
The AlphaDogfight Trials in August marked the finale of the Pentagon research agency’s AI air combat competition. The now-famous algorithm, developed by Heron Systems, easily defeated the fighter pilot in all five rounds that capped off a yearlong competition hosted by the Defense Advanced Research Projects Agency ― which is overseen by Lewis and the Defense Department’s research and engineering shop.
“The key takeaway from that was the artificial intelligence system did so well because it wasn’t so concerned about self-preservation, it was willing to do things that a human pilot wouldn’t do. And that’s the advantage of artificial intelligence,” Lewis said. “I think the real answer is teaming AI with a human for the best combination of both. So I’m pretty confident we’re going to have human pilots into the future.”
The AlphaDogfight Trials were a subset of the Air Combat Evolution program, or ACE, which is one of a few DARPA efforts exploring human-machine teaming, agency spokesman Jared Adams said in an email.
ACE uses human-machine collaborative dogfighting to increase trust in combat autonomy, with the goal of scaling to more complex, multi-aircraft scenarios that pave the way for live, campaign-level experimentation.
Fiscal 2023 will see the first in a yearlong series of trials using tactical fighter-class aircraft (currently L-39 trainers), with safety pilots on board to assist in case of trouble. Those pilots would be given “higher cognitive level battle management tasks while their aircraft fly dogfights,” all while sensors gauge the pilots’ attention, stress and trust in the AI, Adams said.
DARPA foresees a single human pilot serving as a mission commander in a manned aircraft, orchestrating multiple autonomous, unmanned platforms that would all be engaged in individual tactics. ACE would ultimately deliver that capability.
“ACE, therefore, seeks to create a hierarchical framework for autonomy in which higher-level cognitive functions (e.g., developing an overall engagement strategy, selecting and prioritizing targets, determining best weapon or effect, etc.) may be performed by a human, while lower-level functions (i.e., details of aircraft maneuver and engagement tactics) [are] left to the autonomous system,” Adams said.
“In order for this to be possible, the pilot must be able to trust the autonomy to conduct complex combat behaviors in scenarios such as the within-visual-range dogfight before progressing to beyond-visual-range engagements.”
In announcing the future trials using tactical aircraft on Wednesday, Defense Secretary Mark Esper said: “AI’s role in our lethality is to support human decision-makers, not replace them.”
“We see AI as a tool to free up resources, time and manpower so our people can focus on higher-priority tasks and arrive at the decision point, whether in a lab or on the battlefield, faster and more precise than the competition,” he added.
But Esper warned that both Russia and China were pursuing fully autonomous systems, and drew a distinction between them and what he described as the U.S. military’s ethically guided approach to AI.
“At this moment, Chinese weapons manufacturers are selling autonomous drones they claim can conduct lethal targeted strikes,” he said. “Meanwhile, the Chinese government is advancing the development of next-generation stealth UAVs, which they are preparing to export internationally.”
Andrew Eversden contributed to this report.