WASHINGTON — The Defense Department will invest the $12 billion to $15 billion slotted in its fiscal year 2017 budget for developing a Third Offset Strategy on several relatively small bets, hoping to produce game-changing technology, the vice chairman of the Joint Chiefs of Staff said.

Broadly speaking, key categories will be energy production and storage, lethal endgame technology, and software guidance and control, said US Air Force Gen. Paul Selva at a Thursday event hosted by the Brookings Institution.

"This is a learning space. Some of the investments we make won't to pay off. But we are going to place multiple small bets on places where we think we can make a difference, where we think the leverage of the technology exists to actually move the enterprise forward and look at the potential for a third offset," Selva said.

"The question we're trying to pose now is, 'Do the technologies that are being developed in the commercial sector principally provide the kind of force multipliers that we got when we combined tactical nuclear weapons [in the first offset] or precision and stealth [in the second offset]?' And if the answer is yes, then we can change the way that we fight in this battlespace," he said. If not, the military will likely seek to improve its current capabilities slightly to gain an edge over its adversaries.

Lethal endgame technologies, such as directed energy weapons or powder guns firing hypervelocity rounds, could flip the economics of missile defense, he said. Rather than using an expensive, elegant weapon to intercept a ballistic or cruise missile, the US would be better off using lots of relatively inexpensive devices to counter a missile attack, forcing enemies to spend heavily to develop more sophisticated weapons, he said.

The US is on "the absolute wrong end of the cost-imposition curve," he said. "We're doing a $10 solution for a 10 cent problem. We need a 10 cent solution for a $10 problem."
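As a rough illustration of that arithmetic, here is a minimal sketch of a cost-exchange calculation in Python. Every figure in it is hypothetical, picked only to show how the ratio flips, not drawn from actual program costs.

```python
# Illustrative cost-exchange arithmetic for the argument above.
# All figures are hypothetical, chosen only to show how the ratio
# flips; they are not actual program costs.

def cost_exchange_ratio(defender_cost_per_shot: float,
                        shots_per_engagement: int,
                        attacker_cost_per_missile: float) -> float:
    """Defender dollars spent per attacker dollar destroyed."""
    return (defender_cost_per_shot * shots_per_engagement) / attacker_cost_per_missile

# Today's posture: one expensive, elegant interceptor per incoming missile.
print(cost_exchange_ratio(10_000_000, 1, 1_000_000))   # 10.0 -- "$10 for a 10 cent problem"

# The posture Selva describes: several cheap rounds against the same missile.
print(cost_exchange_ratio(25_000, 4, 1_000_000))       # 0.1 -- "10 cents for a $10 problem"
```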

There are several new technologies on the cusp of being implemented that have serious ethical implications, Selva noted during the wide-ranging discussion. The international community will need to establish accepted norms for biological augmentation and for embedding mechanical capability in humans, for example.

Additionally, artificial intelligence — machines that learn and react, rather than robots that perform pre-programmed tasks within defined parameters — poses a looming challenge.

"Artificial intelligence can help us with a lot of things that make warfighting faster, that make warfighting more predictable, that allow us to mine all of the data we have about an opponent to make better operational decisions," he said. "But I'm leaving none of those decisions at this moment to the machine."

Autonomous vehicles are already deployed in the field: on and in the water, on land and in the air, he said.

"I might say to the weapon, go learn the signature. Once you've learned the signature, identify the target. That's about as far as I'm willing to go at this point," he said. "Once you've identified the target, a human has the responsibility to make the decision to prosecute the target."

But the military may soon have to decide whether it is willing to deploy unmanned, autonomous systems that can launch on an enemy.

"There are ethical implications, there are implications for the laws of war. There are implications for what I call 'The Terminator' Conundrum: What happens when that thing can inflict mortal harm and is empowered by artificial intelligence?" he said, referring to the science fiction movie featuring Arnold Schwarzenegger as a cyborg sent on a lethal search and destroy mission. "How are we going to know what is in the vehicle's mind, presuming for the moment that we are capable of creating a vehicle with a mind?"

One difficulty will be testing autonomous, learning systems to the necessary degree of certainty that they will perform as intended, he said.

"In the [Defense] Department, we build machines and we test them until they break. You can't do that with an artificial intelligence, deep learning piece of software. We're going to have to figure out how to get the software to tell us what it's learned," he said.

But at what point can you trust that a learning machine will reach the desired conclusions and respond the way you want it to outside of a lab's controlled environment?
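One concrete version of that question is distribution shift: a model that scores well on lab data can fail quietly once a correlation it learned no longer holds in the field. The sketch below, built on synthetic data and scikit-learn, shows only the measurement, not a solution.

```python
# One concrete form of the trust question above: a model that looks
# reliable in the lab can degrade silently once a correlation it
# learned no longer holds in the field. Synthetic data throughout;
# the "shortcut" feature and the shift are invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

# Lab data: feature x1 happens to track the label-bearing feature x0.
x0 = rng.normal(size=5_000)
x1 = x0 + rng.normal(scale=0.1, size=5_000)
X_lab = np.column_stack([x0, x1])
y_lab = (x0 > 0).astype(int)

model = LogisticRegression().fit(X_lab, y_lab)

# Field data: same task, but the correlation between features is gone.
f0 = rng.normal(size=5_000)
f1 = rng.normal(size=5_000)
X_field = np.column_stack([f0, f1])
y_field = (f0 > 0).astype(int)

print("lab accuracy:  ", accuracy_score(y_lab, model.predict(X_lab)))
print("field accuracy:", accuracy_score(y_field, model.predict(X_field)))
```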

"Those are the problem sets I think we're going to have to deal with in the technology sector that make building the platform actually a relatively simple problem," he said.

With its collection of giant databases around the world, the DoD needs deep learning systems to sort through them, he said. Teaching machines to coherently advise humans carries huge consequences, he said.

"The datasets that we deal with have gotten so large and so complex that if we don't have something to help sort them, we are just going to be buried in the data," Selva said. "If we can build a set of algorithms that allows a machine to learn what's normal in that space, and then highlight for an analyst what's different, it could change the way we predict the weather, it could change the way we plant crops. It can most certainly change the way we do change detection in a lethal battlespace."

Email: aclevenger@defensenews.com

Twitter: @andclev
