
One aspect of modern warfare that has attracted growing attention in recent years is the role of unmanned aircraft in combat, and their substantial part in an international trend towards remote warfare. British RAF remotely controlled ‘Reapers’ (currently the RAF’s only armed unmanned aircraft), for instance, used their weapons in Afghanistan 123 times in the first ten months of 2010. Earlier this year, David Cameron promised to increase the number of RAF Reapers in Afghanistan from four to nine, at an estimated cost of £135m. Indeed, just today the White House approved the use of missile-armed Predator drones to help Nato target Colonel Gaddafi’s forces in Libya; the aircraft are themselves flown by remote control by operators in the US.
Yet, whilst these so-called drone aircraft are not restricted to military use, it is their direct involvement in civilian deaths that has brought them under ever closer scrutiny. For many, the idea of a military vehicle controlled remotely from thousands of miles away (the Reaper, like the Predator, is operated by RAF personnel based at Creech Air Force Base in Nevada, US) is deeply unsettling.
Furthermore, as Richard Norton-Taylor and Rob Evans’ recent article in The Guardian highlights, the development of armed autonomous machines raises a multitude of complex moral and legal issues. Indeed, as the geographical and psychological distance between soldier and target widens, the difficulties over moral responsibility and justification increase.
It is against this background of growing controversy over the use of unmanned aircraft that the Ministry of Defence has released an internal report, entitled ‘The UK Approach to Unmanned Aircraft Systems’, drawn up last month by the ministry’s internal thinktank, the Development, Concepts and Doctrine Centre (DCDC), based in Shrivenham, Wiltshire. The report acknowledges the range of complex legal and ethical issues that the employment of automated drone aircraft creates. For example, the report asks: “is a programmer guilty of a war crime if a system error leads to an illegal act? Where is the intent required for an accident to become a crime?” For these reasons the report recommends that “the UK must establish quickly a clear policy on what will constitute acceptable machine behaviour in future”.
In particular, the report acknowledges public concern about the speed with which the technology is developing, and the likelihood that, as a consequence, armed systems will acquire more autonomy. As the report notes, there is a “school of thought that suggests that for war to be moral (as opposed to just legal) it must link the killing of enemies with an element of self-sacrifice, or at least risk to oneself”. By removing that element of risk, automated drones therefore severely weaken any such moral justification.
However, the MoD report admits that “there is a danger that time is running out”, asking whether, in fact, “debate and development of policy [are] even still possible, or is the technological genie already out of the ethical bottle, embarking us all on an incremental and involuntary journey towards a Terminator-like reality?”