The U.S. Defense Advanced Research Projects Agency, more commonly known as DARPA, is developing a system intended to enable aerial drones to choose which combat targets to engage on their own, without input from human controllers. However, this may constitute a violation of the Geneva Convention.

DARPA’s LAWS (Lethal Autonomous Weapons Systems) program would allow aerial drones to autonomously differentiate between human targets that are considered threats and those that are not, independent of the human control that currently governs combat drones. The agency is also looking to develop a program called Collaborative Operations in Denied Environment (CODE), which would allow autonomous drones to coordinate maneuvers and attacks, again without the need for human input.

However, Stuart Russell, a professor of computer science at the University of California, Berkeley, has published an article in the journal Nature pointing out that such systems might be in violation of the Geneva Convention. "Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenseless. This is not a desirable future," Russell warns.

The United Nations has held a series of meetings concerning DARPA’s LAWS program and agrees that the issue needs to be addressed.
