In many fields of robotics, self-learning systems like neural networks are eschewed because they are non-deterministic and might react unpredictably. Do you think this will be different in a military context, where “collateral damage” is accepted as a necessary evil and the definition of an enemy has already been reduced to “was at the targeted kill zone when the drone-fired rocket exploded”?
War turns social norms on their head (one reason for its damaging psychological effects). However, the emotional distance autonomous drones provide might make indiscriminate killing easier. Once they are deployed, it will be very difficult for the public (or even commanders) to distinguish mistakes from intentional targeting. The soldiers who sent the drone might not be well-versed in its internal design, merely in its operation or maintenance, and they might never see video from its gun cameras. Moreover, a drone’s firmware is unlikely to be called to The Hague to face charges for war crimes. Who would be to blame if things go wrong? The drone manufacturer? The purchaser? The system operator? Such diffusion of blame has historically been a source of trouble. Would a firmware update be sufficient assurance that the ‘problem’ was solved?
Most national militaries seek to avoid civilian casualties wherever possible (due to political blowback if nothing else), but unless autonomous drones start killing friendly forces or civilians in significant numbers, I think errors will be tolerated, especially in secret conflicts. This is why checks and balances and reasonable transparency must be included in any drone defense program.
The current public reaction to drone warfare (especially in the US) seems to be largely positive. No friendly soldiers are put at risk, and abstract concepts like the sovereignty of foreign countries seem to be irrelevant. Do you expect this to change?
The American public’s approval of drones is likely to change the day drones attack the U.S. Then you’ll see one of two things: either the public demands international cooperation to root out illicit drone manufacturers and to establish a legal framework for defense-only drone use (unlikely), or (much more likely) an escalation - a free-for-all arms race of robotic weaponry. And unlike nuclear, chemical, or biological weapons, these weapons will almost certainly be used, especially as the technology spreads on black markets. If cooler heads do not prevail and government leaders overreact, the loser will be civil society.
The technology needed to build drones seems to be unstoppable. Consumer toy drones already contain enough processing power to do image recognition. Stable autonomous flight can be achieved with the processors and sensors contained in today’s smartphones. What do you propose to avoid a deterioration into a world like the one you describe in Kill Decision?
Civilian drones will have many beneficial uses (search & rescue, environmental monitoring, exploration, and much more). Drones and other robotics are indeed here to stay, and they will become cheaper, more ubiquitous, and more capable in step with general improvements in processing power, memory, and sensors.