Autonomous robots are on the horizon. The military’s creative minds are seeking new ways to remove direct human control from the ‘kill chain’. At least, that’s the conclusion I gleaned from investigating some of the U.S. Department of Defense’s recent publications. One of the biggest ‘push factors’ for autonomous drones is ‘swarming’ technology and strategy. While some drones are growing ever larger (think of the whale-like Global Hawk), DARPA and other weapons contractors are also shrinking drones to create highly mobile, cooperative swarms of ‘Nano’ drones that can be directed with the point-and-click of a mouse rather than flown under direct control.
In 2010 the U.S. Army released the “US Army Unmanned Aircraft Systems Roadmap 2010-2035”. In it, the report describes the “swarming” tactic of micro-drones called “Nanos”:
“By 2025, Nanos will collaborate with one another to create swarms of Nanos that can cover large outdoor and indoor areas. The swarms will have a level of autonomy and self-awareness that will allow them to shift formations in order to maximize coverage and cover down on dead spots. Nanos will possess the ability to fly, crawl, adjust their positions, and navigate increasingly confined spaces”.
On the 19th of November 2012, Human Rights Watch released a 50-page report called ‘Losing Humanity: The Case Against Killer Robots’. It is the first major report on autonomous drones from outside the beltway, and was jointly published with the Harvard Law School International Human Rights Clinic.
According to the press release, the report explores the moral and legal issues surrounding autonomous weapons systems. Much of the concern is centred around the delegation of responsibility and judgement from humans to robots. “Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch. “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”
The report calls for an international treaty that would ban the development, production, and use of autonomous weapons before it’s too late. According to Human Rights Watch and the Harvard clinic, fully autonomous weapons would not meet the requirements of international humanitarian law because they would not be able to:
(a) distinguish whether an attack targets soldiers or civilians
(b) decide whether an attack is proportional (i.e. whether the expected civilian ‘collateral’ harm outweighs the anticipated military advantage).
Other disadvantages of killer robots include the following:
- Autonomous drones could also lower the threshold of going to war [drones already are, after all].
- Responsibility is so decentralized, shared among commanders, programmers, operators, and manufacturers, that it would create an ‘accountability gap’. This may remove the important checks and balances needed to ensure that state violence adheres to international law.
While we wait for this brave new droneworld, consider a scene from the Tom Cruise science-fiction film Minority Report: tiny, spider-like robots infiltrate an apartment building, searching for the protagonist.