
Jun 13, 2020

Are AI-Powered Killer Robots Inevitable?

Posted in categories: drones, military, nuclear weapons, robotics/AI, singularity

Autonomous weapons present some unique challenges to regulation. They can’t be observed and quantified in quite the same way as, say, a 1.5-megaton nuclear warhead. Just what constitutes autonomy, and how much of it should be allowed? How do you distinguish an adversary’s remotely piloted drone from one equipped with Terminator software? Unless security analysts can find satisfactory answers to these questions and China, Russia, and the US can decide on mutually agreeable limits, the march of automation will continue. And whichever way the major powers lead, the rest of the world will inevitably follow.
Military scholars warn of a “battlefield singularity,” a point at which humans can no longer keep up with the pace of conflict.
