Source: University of Canterbury – statements
First there was gunpowder, then nuclear weapons. Now another revolution in warfighting is under way with the rise of algorithmic warfare using robots and precision targeting technology. Their use is highly controversial, but to date there has been no systematic attempt to map the debate unfolding across academic, intergovernmental, corporate and social media domains.
Internationally, the use of lethal autonomous weapons systems (LAWS), or so-called ‘killer robots’, is being debated at the highest levels. A new University of Canterbury (UC) study, which has recently received a generous grant from the 2019 Marsden Fund/Te Pūtea Rangahau a Marsden, aims to map and analyse this fast-evolving debate. Its findings could help shape the development of effective and ethical regulation of these sophisticated weapons.
Dr Jeremy Moses and Associate Professor Amy Fletcher, from the Department of Political Science and International Relations at UC, and Dr Geoff Ford, a political scientist with the UC Arts Digital Lab, aim to delve deeply into this issue in a three-year project starting this year. Their research has attracted a prestigious 2019 Marsden Fund grant of $842,000.
Around 12 countries are known to be developing LAWS, among them the United States, China, Russia and Israel. While the perception in those nations is that LAWS are needed to maintain geopolitical balance, groups such as the Campaign to Stop Killer Robots and other international bodies are seeking a global ban on their use.
“Proponents argue that autonomous weapons could decrease civilian casualties through enhanced targeting precision and reduce the risks for human soldiers,” Associate Professor Fletcher says. “Opponents fear a world of ‘algorithmic warfare’ in which robots can make decisions to kill in the absence of human oversight and in which the speed and complexity of war accelerates to the point that international rules of conduct are rendered irrelevant.”
This project will apply innovative text-mining tools, some of which have been developed by Dr Ford, to analyse the debate with the goal of producing comprehensive and insightful data that could better inform regulators and decision-makers.
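The release does not describe the text-mining methods themselves. Purely as an illustrative sketch, and not a representation of Dr Ford’s actual tools, the snippet below shows the simplest form such analysis can take: counting how often a set of assumed keywords appears across a handful of placeholder statements. All documents and keyword choices here are invented for illustration only.

```python
# Minimal illustration of keyword-frequency text mining over a small set of
# placeholder statements about autonomous weapons. This is NOT the project's
# actual tooling; it only sketches the general kind of analysis described.
import re
from collections import Counter

# Placeholder texts, invented purely for illustration.
documents = [
    "Autonomous weapons could reduce risks for human soldiers.",
    "Killer robots must remain under meaningful human control.",
    "Algorithmic warfare may outpace international rules of conduct.",
]

# Terms of interest (assumed for this example).
keywords = {"autonomous", "weapons", "robots", "human", "algorithmic", "control"}

counts = Counter()
for doc in documents:
    tokens = re.findall(r"[a-z]+", doc.lower())  # simple lowercase tokenisation
    counts.update(t for t in tokens if t in keywords)

# Print the keyword frequencies, most common first.
for term, n in counts.most_common():
    print(f"{term}: {n}")
```

In practice, research of this kind would work with far larger corpora and richer techniques than simple keyword counts, but the example conveys the basic idea of turning a body of debate texts into quantifiable patterns.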
As Dr Jeremy Moses notes, “virtually all nation-states, including New Zealand, must now grapple with the implications of algorithmic warfare and the ethics and lethality of autonomous weapons.”
This is one of 12 UC-led research projects to have received funding from the 2019 Marsden Fund/Te Pūtea Rangahau a Marsden. A total of $6.54 million was awarded across four of UC’s five colleges. The grants were awarded through a rigorous selection process designed to identify projects of the highest international quality, and they recognise UC as a world-class research-led teaching and learning university.
All media enquiries should be directed to the UC Communications team.