Alexandra Scott is standing on the legal front lines of what may be the most terrifying chapter yet in human warfare. But if the Osgoode PhD student has her way, her research into autonomous weapons systems (AWS), artificial intelligence (AI), international legal structures and engineering ethics will make a meaningful contribution to the debates aimed at banning such weapons.
“My main concern is that engineers are, in the process of designing AWS, making life-or-death decisions without a clear understanding of the laws governing armed conflict and their own professional obligations,” said Scott.
“My goal is to provide engineers with not only relevant legal knowledge,” she added, “but also an understanding of the greater social and political contexts in which they are developing AWS.”
Scott recently received a Dahdaleh Global Health Graduate Scholarship from York University’s Dahdaleh Institute for Global Health Research to support research for her dissertation, currently titled The Myth of “Good Enough”: Law, Engineering and Autonomous Weapons Systems. The scholarships are valued at between $5,000 and $25,000 a year and are renewable for up to three years. Her PhD supervisor is Professor Saptarishi Bandopadhyay.
Autonomous weapons systems – also known as lethal autonomous weapons systems (LAWS) or killer robots – are weapons that use AI to select and engage targets according to programmed criteria, without direct human intervention. Because they may have the capacity to operate without human oversight, critics fear that the technology poses a greater risk to non-combatants than conventional weapons do.
Scott, who holds an engineering degree, a law degree and a master’s degree in law from Queen’s University, said her dissertation will focus particularly on the role of engineers in developing autonomous weapons systems. She said she will examine engineers through three legal lenses: as technical specialists capable of creating autonomous weapons systems, as employees directed by an employer to develop them, and as professionals with ethical obligations that extend beyond their employment contracts.
As its title suggests, the dissertation will debunk the traditional engineering notion that, because perfection is unattainable, a product or project is good if it is “good enough.”
“This idea that a fully autonomous weapon system operating without human guidance or oversight could be good enough is really troubling to me,” she said, “because then we have this gap between what humans are supposed to be afforded under international law and what a non-sentient autonomous weapon system is able to afford them, which is essentially nothing.”
Most of the UN’s member nations currently support an ongoing effort to ban autonomous weapons, but key players like the United States, the United Kingdom and Russia are opposed, said Scott.
She said she was excited to partner with the Dahdaleh Institute because her research aligns so well with its focus on global health and humanitarianism. When it comes specifically to civilians and civilian targets, she said, there is no acceptable level of collateral damage under international law.
“These weapons violate the broad value of human dignity,” she said, “and there’s nothing more fundamentally incompatible with health than someone who’s been unlawfully killed by an autonomous weapon system.”
At the Dahdaleh Institute, Scott is also serving as a graduate research assistant for Dr. James Orbinski, the institute’s director and the former international president of Médecins Sans Frontières (Doctors Without Borders).
“I feel incredibly fortunate to have been offered this opportunity,” she said of the Dahdaleh Graduate Scholarship. “It’s renewed my passion for why I wanted to do a PhD in the first place and has also given me a much sharper focus on where I want to take my dissertation over the next two years.”