Army Robots Won't Use Lethal Force, But Will Still Increase Lethality

While humans are needed to make decisions regarding the use of lethal force, there are many situations where weapons might be needed for purely defensive or non-lethal purposes.

Army robots will be armed with weapons, but don’t expect a real-life Terminator anytime soon. Department of Defense policy requires decisions regarding the use of lethal force to be made by humans.

This is done for ethical and tactical reasons, as machines are poorly suited to understand all the variables involved in using lethal force. The unique attributes of human reasoning, intuition, feeling, and intent cannot be replicated by a mathematically driven robot.

However, artificial intelligence (AI)-enabled machines can perform certain calculations, functions, data analysis, and information organization. Robot technology will likely see huge leaps in autonomy in the coming years. As algorithms improve, robots and unmanned systems will take on more responsibility and decision-making authority, reducing the cognitive load placed upon human decision-makers. Army leaders believe the optimal strategy involves a careful blend of human cognition and AI-enabled unmanned autonomy. This is the foundation of manned-unmanned teaming: the idea is to leverage the best of each to complement the other. Computers are well-suited to handle information processing, sensor integration, real-time analytics, non-lethal decision-making, problem-solving, and sensor-to-shooter pairing. But computers cannot compare to human thought, feeling, and some kinds of dynamic problem-solving.

These variables inform the Army’s progress with its Robotic Combat Vehicle (RCV) program. The program expects to refine human-machine interaction and connectivity through upcoming soldier-robot experiments called “soldier touchpoints.”

The Army is now surging forward with testing and experimentation on its RCV-Medium effort. All of the offerings under consideration are armed, albeit with the understanding that humans will operate in a command-and-control capacity when it comes to the use of lethal force.

“There will be a radio that connects the user with the robot, and the (graphical user interface), the user interface between the human and the robot,” Maj. Gen. Ross Coffman, Director, Next-Generation Combat Vehicle Cross Functional Team, Army Futures Command, told The National Interest in an interview. “What we're really trying to get right is to make sure that the robot can interface with the human so that we have the lowest cognitive load.”

The goal is to ensure that the most crucial human-machine interfaces are preserved and secured while robots and unmanned systems free up human decision-makers. Commanders at war will not need to divert time and energy away from those crucial variables, nuances and problems best addressed by humans.

“The autonomy reduces the number of human interfaces with the robot so they can actually perform on the battlefield in such a way that humans are less encumbered with decisions moving forward,” Coffman said.

Also, while humans are needed to make decisions regarding the use of lethal force, there are still many situations where weapons might be needed for purely defensive or non-lethal purposes. AI-enabled computing, for example, might be able to identify and intercept incoming enemy munitions such as rockets, mortar fire, or artillery without requiring human direction. Pentagon reports describe these situations as “out-of-the-loop” scenarios, referring to the possible autonomous use of weapons in ways that do not involve lethal force.

Kris Osborn is the defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a Master’s Degree in Comparative Literature from Columbia University.  

Image: Reuters.