House of Lords Special Committee on the Use of Artificial Intelligence in Weapon Systems
Origin of the Committee
On 7 December 2022, following increasing parliamentary interest and the work of a group of committed parliamentarians, the House of Lords Liaison Committee recommended the creation of a Special Committee on the Use of Artificial Intelligence in Weapon Systems to be established in 2023.
Each parliamentary year a total of four special inquiry select committees are established in the House of Lords, to examine and report on a range of issues and specialist subjects. They provide an important instrument to hold government accountable on key policy issues. The recommendation for a Special Committee on the Use of Artificial Intelligence in Weapon Systems – from a list of twenty submitted proposals – is recognition of the pressing need for greater scrutiny and clarity over the UK’s policy towards autonomous technology in a military context, something UNA-UK has long been calling for through our work as a Steering Committee member of the UK Campaign to Stop Killer Robots.
The 2017–19 Select Committee on Artificial Intelligence (AI), which examined the broader implications of advances in AI, noted that the military use of this technology merited a committee of its own. But, until now, no committee in either the Lords or the Commons has focused specifically on the military use of artificial intelligence.
UNA-UK is delighted with the development, which will translate to a significant uptick in parliamentary scrutiny on the issue of lethal autonomous weapons systems (LAWS). We specifically welcome the committee’s intended focus on the UK’s approach to both domestic and international policy in this area. The one-year initiative will allow for a much-needed assessment of the present state of autonomous technology and the risks and prospects for deployment; an examination of existing international law in this area and the UK’s role in negotiations towards a new treaty to regulate autonomous weapons systems; and an in-depth assessment of the adequacy of the Ministry of Defence’s national strategy for the deployment of AI. It will also help formulate recommendations aimed at ensuring the ethical and responsible use of AI technology in weapon systems.
Also known as “killer robots”, LAWS are becoming a key issue of concern for international peace and security, with the UN Secretary-General describing machines capable of taking human life without human involvement as “politically unacceptable” and “morally repugnant”.
UNA-UK has been campaigning on this issue since 2013 and, as coordinator of the UK Campaign to Stop Killer Robots, we are calling for an international treaty to regulate and establish limits to the use of autonomous weapons systems. We hope the committee will provide an opportunity to interrogate the fundamental ethical question regarding our relationship with technology – of how far society ought to be prepared to go with respect to outsourcing military operations to algorithms, sensors, and autonomous technologies. We look forward to contributing to the important work this committee will undertake.
The fourth session of the Special Committee on Artificial Intelligence in Weapon Systems
On 27 April 2023, the fourth evidence session of the Artificial Intelligence in Weapon Systems inquiry took place. The focus was on gathering evidence regarding human rights and ethical concerns related to autonomous weapon systems. The session included Verity Coyle, Senior Campaigner and Advisor at Amnesty International, a member of the global Stop Killer Robots campaign; Mariarosaria Taddeo, Associate Professor and Senior Research Fellow at the Oxford Internet Institute, University of Oxford; and Alexander Blanchard, Digital Ethics Research Fellow at the Alan Turing Institute.
This marked the first participation of UK-based members of Stop Killer Robots in this forum, advocating for an international treaty to regulate and establish limits on the use of autonomous weapon systems. During the public evidence session, the expert witnesses provided their insights on the technical, legal, moral, and societal concerns surrounding autonomous weapons and advocated for action to address this significant emerging security threat.
UNA-UK hopes the Committee’s report will reflect the compelling arguments put forth, which highlight the need for strict legislation to rein in the potential harm that AWS could cause. The most effective way to do this will be for the UK to work with the international community to urgently negotiate a binding legal framework, including prohibitions and positive obligations, to regulate these weapons.
Some key points participants raised in their evidence included:
Verity Coyle, Senior Campaigner and Advisor from Amnesty International, member of the UK Campaign to Stop Killer Robots:
“The use of AWS, whether in armed conflict or in peacetime, implicates and threatens to undermine fundamental elements of international human rights law, including the right to life, the right to remedy, and the principle of human dignity. The threat to human dignity posed by delegating life and death determinations also presents ethical problems and raises concerns under the principle of humanity under IHL’s Martens clause.”
Mariarosaria Taddeo, Associate Professor and Senior Research Fellow at the Oxford Internet Institute, University of Oxford:
“Autonomous weapon systems raise other serious human rights concerns outside of situations of armed conflict, threatening the right to life, the prohibition of torture and other cruel, inhumane or degrading treatment or punishment, and the right to security of person. We are currently undergoing a technological revolution in many fields driven by AI, machine learning, miniaturization, and automation.”
Alexander Blanchard, Digital Ethics Research Fellow at the Alan Turing Institute:
“In terms of whether the systems have been used, the United Nations identified, in a report, Kargu-2 by STM as the first use of an autonomous weapons system. As Verity also highlights, the problem is that it is not clear whether these are being used in an autonomous setting. What tends to happen is that an arms manufacturer creates a system and markets it as fully autonomous, because this is good for sales. It gets used and draws publicity, and the manufacturer then changes the specifications that it announced around that system to say, ‘This isn’t actually a fully autonomous system. It still has levels of human control’.
There is a real question there not only of definition but of transparency of the use. I am not sure how you address that, but it is about knowing whether this system was indeed used as a fully autonomous, or autonomous, weapons system.”
The recorded session is available here: https://parliamentlive.tv/event/index/441252bd-fb70-4907-b1c5-7e27014458bb
The inquiry accepted written evidence until an extended deadline of 4.00pm on Monday 8 May 2023.
UNA-UK has been campaigning on this issue since 2013 and hosts the coordinator of the UK Campaign to Stop Killer Robots. Check our latest report on the links between autonomous weapons systems and the UK academic research sector.
Read more:
- UNA-UK welcomes new Special Committee on the use of AI in weapon systems
- UK Campaign to Stop Killer Robots website
- 33 states call for urgent negotiation of new international law to limit autonomous weapons
- Committee’s work and past witness sessions
This article is based on others first published by UNA-UK.