Nov. 28, 2022

Autonomous weapons in warfare will become more common: UCalgary expert

Maureen Hiebert says the main debate will centre on how much human control should be retained over these weapons
Autonomous and semi-autonomous weapons systems, like drones, will become more and more common in warfare as the 21st century progresses. Colourbox

If you’ve seen the news lately about Russia’s bombardment of Ukraine with drones, or the videos of the Boston Dynamics robot dog with a machine gun welded to its back, then you know the time of robots in warfare is upon us.

According to Dr. Maureen Hiebert, PhD, an associate professor in the Department of Political Science at the University of Calgary, many countries, including the United States, Russia, and China, have well-developed plans around the increased use of artificial intelligence, machine learning, robots and other autonomous systems in combat.

“The debate isn’t so much about whether this technology will be used militarily — that has already been effectively decided and planned for,” says Hiebert, who is also the acting director and graduate program director for the Centre for Military, Security, and Strategic Studies.

“It’s more a matter of the level of autonomy and how much human control there will be over these systems.”

'Different form of warfare' than we've ever seen

Hiebert says there is reasonable concern about the development of technologies that use lethal force where the entire decision-making process about whether to use that force belongs to the machine.

“Then I think we’re in a completely different form of warfare we’ve never seen before,” she says.

“You’d have a situation where, for the very first time, the tools of war become combatants themselves.”

A secondary concern people have raised is that the decisions made by autonomous systems are built on complex sets of data and machine learning, so humans can understand only the outcome, not the decision-making process.

For these reasons, the International Committee of the Red Cross has pressed states to adopt legally binding rules on the use of autonomous weapons. However, Hiebert says these efforts have foundered, as war-fighting states are the least keen on having rules imposed on the level of human control.

This is because, in combat situations, any semi-autonomous system with greater human control requires communication between human and machine, and that link is vulnerable to enemy jamming or hacking.

If you’re thinking that humanity itself could band together to prevent the rise of “killer robots,” not so fast.

Autonomous technology development on parallel tracks

Hiebert says that because many of the advances in areas like AI and machine learning are coming from the civilian sector, we are already familiar with the technology and use it in our day-to-day lives.

In her current research, she imagines autonomous technology developing along two parallel tracks. On one track are the civilian uses of the technology that we will be familiar with; on the other are military applications of the same technology.

“Already this kind of technology is so suffused in how we carry out our day-to-day lives that stigma won’t be there because we’re going to be using exactly the same kind of technology,” Hiebert says.

While the combatants may be robotic, the determining factor in who wins will still be very human. Hiebert says the foundational idea behind winning a war is that you have eliminated your opponent’s ability and willingness to keep fighting, so it will ultimately still come down to human commanders and politicians to decide whether the conflict continues.

“That dynamic and logic of warfare will probably remain the same,” she says.

Use of force could be seen as 'clean and safe'

Instead, the concern should be that the prohibition on the use of force, a principle of international law that admits exceptions only for individual and state self-defence, becomes less binding. Hiebert says state and non-state actors may be more willing to use force against one another because the human cost to their own side is lower when autonomous fighters do the fighting.

“There may be a greater willingness to resort to the use of force because it’s not going to have the same bad outcomes as traditional warfare,” she says.

Hiebert points out that proponents of autonomous weapons also argue these systems will only become more precise, better safeguarding civilians on the battlefield.

However, she says this too is concerning: if the use of force is seen as clean and safe, the prohibition against it may loosen even further.

“You could have civilians going about their daily lives and over in the corner a bunch of autonomous systems could be duking it out,” says Hiebert.

“If we start thinking of warfare like that it becomes the gamification of war. I think that’s very troubling.”

