WASHINGTON ― As the United States, Russia and China press ahead with the development of unmanned autonomous weapon systems, questions about how these new weapons will be governed and regulated are becoming more salient.

This week, parties to the Convention on Certain Conventional Weapons (CCW) will meet in Geneva to discuss the definition of “meaningful human control,” a term that is central to the ongoing regulation discussion.

But for some legal experts, the bigger question is “whether the international community as a whole will demand compliance with any legal developments in Geneva on autonomous weapons, or compliance with the existing law we already have that’s implicated with this new technology,” Mary Ellen O’Connell, professor of law at the Notre Dame Law School, said last Thursday during a keynote address at the Brookings Institution. “We have the U.N. Charter and other principles restricting the use of military force, we have principles of international humanitarian law to govern combat on the battlefield and we have human rights law. It’s all relevant.”

While any new weapon system introduced to a battlefield must abide by the principles found in current law, lethal autonomous weapon systems (LAWS) present unique challenges regarding accountability. Jeroen van den Hoven, professor of ethics and technology at Delft University of Technology in the Netherlands, believes a more fine-tuned concept of responsibility is needed to truly address the issues posed by LAWS.

“We come to these subjects, which are very complex and dynamic, and bring a theory of a 2,000-year-old [concept] that has been with us for quite some time and apply it to a very new, dynamic and very complex world,” van den Hoven said. “It is as if you want to compare a Swiss precision watch with a sledgehammer. You cannot do that; it won’t work. It will never give you the right result.”

Two accidents involving self-driving cars in January demonstrate the brittleness of current autonomous technology and the regulatory challenges posed by such systems.

But when it comes to adjudicating behavior on the battlefield, for some the issue remains straightforward. “The law of war is well-established, and whatever system we have, whether it is autonomous or human-directed, it is going to have to comply with the fundamentals of the law of armed conflict, period,” said retired United States Air Force Major General Charles Dunlap.

One issue with attempting to regulate new weapon systems is that “inherently you are looking at a snapshot in time. ... There is a big risk in trying to capture a snapshot in time of technology and trying to ban it,” said Dunlap. He added that “what we really are talking about here are weapons that don’t exist.”

Yet the fact that autonomous weapon systems driven by machine learning are not currently being fielded doesn’t mean they won’t be in the future.

So how should the international community prepare?

For Dunlap, a recognized testing and evaluation norm is needed to ensure that when these systems are deployed, they behave in the ways battlefield commanders expect. “If we can’t come to that, where we can reliably say that the weapon is going to operate as intended, lawfully, then we can’t field it,” he said.

But O’Connell remains optimistic about the possibility of progress in the legal space, not only this week in Geneva, but also when the group of governmental experts on LAWS meets again in August and when states party to the CCW convene in November.

“I believe they are actually going to come up with something. ... I think in November we are going to see some kind of limitation on autonomous weapons either in the form of a new protocol like the blinding laser protocol, or at least a declaration that brings meaningful human control into our legal thinking about weapons,” she said.

Daniel Cebul is an editorial fellow and general assignments writer for Defense News, C4ISRNET, Fifth Domain and Federal Times.
