
Pentagon’s ‘killer robots’ may cause mass destruction, advocacy groups warn

The Army's artificial intelligence software prototype, designed to quickly identify threats using a range of battlefield data and satellite imagery. (Screenshot/U.S. Army)
September 28, 2023

The Pentagon’s push for fully autonomous weapons systems and drones has led to concerns from critics that autonomous “killer robots” will increase the risk of civilian casualties, mass destruction and other issues.

The Pentagon is currently working with the military technology industry on the Replicator initiative, which is intended to develop autonomous systems of water vessels, drones, aircraft, and defense systems that can be synchronized through computer software.

“We’ve set a big goal for replicator: to field attritable, autonomous systems at a scale of multiple thousands [and] in multiple domains within the next 18-to-24 months,” Deputy Secretary of Defense Kathleen Hicks said recently.

While the Pentagon has highlighted the potential advantages of autonomous drones and weapons systems, The Hill reported that many critics and arms control advocates argue the existing limitations on autonomous weapons are insufficient to protect against the increased risks posed by fully autonomous weapons systems.

According to The Hill, critics have described autonomous weapons that are powered by artificial intelligence and capable of operating without human aid as “killer robots” and “slaughterbots.”

“It’s really a Pandora’s box that we’re starting to see open, and it will be very hard to go back,” said Anna Hehir, who leads autonomous weapons systems research at the Future of Life Institute, an advocacy group. “I would argue for the Pentagon to view the use of AI in military use as on par with the start of the nuclear era. So, this is a novel technology that we don’t understand. And if we view this in an arms race way, which is what the Pentagon is doing, then we can head to global catastrophe.”

READ MORE: Pentagon creating ‘robot’ language so drones can communicate without humans

Hicks announced the Replicator initiative during an August defense conference, describing it as a “game-changing” plan that would allow the United States to counter the growth of China’s military. While autonomous weapons systems have been used in different forms for years, the Replicator initiative aims to develop large quantities of drones and other weapons systems that can be powered by artificial intelligence.

According to The Hill, Hicks has said that the Replicator initiative will follow the ethical guidelines established for autonomous weapons systems. Those guidelines, set out in the Pentagon’s artificial intelligence directives and updated in January, emphasize that senior-level commanders must review and approve autonomous weapons systems.

While the Pentagon’s guidelines require an “appropriate level of human judgment” to be exercised before AI weapons can be used in warfare, a Congressional Research Service report claimed that an “appropriate level of human judgment” was a “flexible term” that would not be applicable in every situation. Additionally, the report found that another guideline requiring “human judgment over the use of force” does not actually require the direct control of humans.

Michael Klare, secretary of the Arms Control Association’s board of directors, has expressed doubts that “it will always be possible to retain human control over all of these devices.” According to The Hill, Klare fears that autonomous weapons systems could conduct unauthorized missions, such as carrying out attacks against nuclear facilities.

“The multiplication of these kinds of autonomous devices will increase the potential for unintended or accidental escalation of conflict across the nuclear threshold [that could] trigger a nuclear war,” Klare said. “We fear that there’s a lot of potential for that down the road. And that’s not being given careful consideration by the people who are fielding these weapons.”

Despite concerns being raised by critics of autonomous weapons systems, Eric Pahon, a Department of Defense spokesperson, has claimed the United States is the “world leader in ethical AI standards.”

“We’re always going to have a person responsible for making decisions,” Pahon said. “We’re committed to having a human responsible for decision-making.”