Does the U.S. Navy's Next Super Weapon Have a Fatal Flaw?
A new study on naval drones warns that the real problem with autonomous drones isn't that they'll go berserk, but that they can't adapt to the unexpected.
Autonomous combat drones stoke fears of killer robots run amok, driven by some flawed or hacked AI convinced that that yellow school bus is a tank.
But a new study of naval drones warns that the real problem isn't autonomous machines going berserk; it's their inability to adapt to the unexpected.
“Because autonomous systems have not, to date, learned adaptive behavior or an ability to interpret context, they appear to be particularly vulnerable to countermeasures that alter some feature of their expected environment,” according to the RAND Corp. study, which looked at naval unmanned underwater vehicles (UUVs).
As with their air and land counterparts, naval drones are reshaping maritime warfare. The U.S. Navy and others are developing UUVs that can detect and sweep mines, as well as conduct reconnaissance and surveillance missions. And lurking in the background is the prospect of armed drones, though for now the world’s militaries stress that no trigger will be pulled without a human in the loop.
Yet human authorization undercuts one of the advantages of autonomy: the ability of machines to function faster and more efficiently than humans. And because robot autonomy has not progressed to the point where robots can handle the unexpected, human control will still be needed when things go wrong.
“A human operator’s ability to recognize that a system is misinterpreting some part of the environment may be a particularly important oversight mechanism,” said RAND. “Such oversight also implies the ability to access the machine’s learning capability to provide images or other means of recognizing something unexpected. So, human operator intervention will occur less when an engagement decision is being made and more when it is apparent that the system is behaving in ways that indicate misunderstanding of the events and conditions around it. An autonomous system must possess an interface that allows periodic assessment of what it senses and how it is reacting to what it senses. The degree to which this is done will depend, in part, on the system operating undetected by adversaries.”
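That passage points to a concrete design requirement: some channel through which an operator can periodically inspect what the vehicle believes it is sensing and how it is reacting. The RAND report does not specify an implementation; the sketch below is purely illustrative, and every class and method name in it is hypothetical.

```python
# Purely illustrative: one way to expose the "periodic assessment" interface
# RAND describes, so an operator can review what the vehicle thinks it senses
# and how it is reacting. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class PerceptionSnapshot:
    """What the vehicle believes it is seeing and doing at one instant."""
    sensed_objects: list[str]
    current_action: str
    operator_flagged: bool = False


@dataclass
class OversightLog:
    """Rolling record an operator can review whenever contact is possible."""
    snapshots: list[PerceptionSnapshot] = field(default_factory=list)

    def record(self, sensed_objects: list[str], current_action: str) -> None:
        self.snapshots.append(PerceptionSnapshot(sensed_objects, current_action))

    def flag_misinterpretation(self, index: int) -> None:
        # The operator marks a snapshot where the vehicle misread its
        # environment; a learning system could later train on such labels.
        self.snapshots[index].operator_flagged = True


if __name__ == "__main__":
    log = OversightLog()
    log.record(["moored contact", "fishing net"], "classifying contact as mine")
    log.flag_misinterpretation(0)
    print(log.snapshots[0])
```

How often such a log could actually be reviewed depends, as the study notes, on whether the vehicle can communicate without being detected by adversaries.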
The study also found that doctrine for U.S. Navy UUVs replicates the “kill chain” sequence for manned ships. For example, in the case of surveillance missions in hostile waters by a manned submarine, Navy practice has been to plan the mission, sneak into the area, conduct the mission, analyze the sensor data, transmit it, and then sneak away.
For an underwater drone to do this autonomously, it would need sophisticated capabilities to navigate, optimize sensor use, analyze data to minimize the number of transmissions, and react appropriately to threats. “All these actions involve degrees of cognition, learning, and the ability to react appropriately to changing circumstances,” RAND notes. “Our technological assessment does not indicate that many of these capabilities are imminent, at least not at a level that allows the UUV to perform the tasks that make the activity valuable.”
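The report does not prescribe software, but the sequence it describes can be made concrete with a brief, hypothetical sketch. Everything below, including the phase names, the SurveillanceUUV stub, and the detects_unexpected check, is invented for illustration rather than drawn from Navy doctrine or the RAND study.

```python
# Purely illustrative: the surveillance kill chain described above, expressed
# as an ordered sequence of mission phases. All names are hypothetical.
from enum import Enum, auto


class Phase(Enum):
    PLAN = auto()
    INGRESS = auto()
    COLLECT = auto()
    ANALYZE = auto()
    TRANSMIT = auto()
    EGRESS = auto()


class SurveillanceUUV:
    """Stub vehicle: each phase stands in for a real capability."""

    def detects_unexpected(self) -> bool:
        # In practice this is the hard problem RAND flags: recognizing that
        # the environment no longer matches what the system expects.
        return False

    def execute(self, phase: Phase) -> None:
        print(f"executing {phase.name}")


def run_mission(uuv: SurveillanceUUV) -> Phase:
    """Step through the kill chain; defer to a human if anything looks off."""
    for phase in Phase:
        if uuv.detects_unexpected():
            print(f"unexpected condition in {phase.name}: requesting operator review")
            return phase
        uuv.execute(phase)
    return Phase.EGRESS


if __name__ == "__main__":
    run_mission(SurveillanceUUV())
```

Even in this toy form, the value hinges on the detects_unexpected step, which is exactly the kind of contextual judgment RAND says has not yet matured.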
The study also examined a perennial question with modern weapons: is it better to have one multipurpose platform that can do it all, or multiple, specialized platforms? RAND favors multiple platforms. For example, rather than relying on a single large UUV to detect and destroy mines, the task could be broken down—assuming the requisite communication and coordination capabilities can be developed—so that a group of small drones does the detecting, while another group destroys the mines. A collaborative approach would mean each small drone needs less autonomy to do its job than a single large craft would.
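Again, RAND offers no reference design; the sketch below is a hypothetical illustration of that division of labor, with one group of drones filtering sonar contacts and another receiving the resulting tasking. The data fields, the confidence threshold, and the round-robin assignment are all invented for illustration.

```python
# Purely illustrative: splitting mine countermeasures across two groups of
# small, specialized drones rather than one large do-everything vehicle.
# All names and the coordination scheme are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Contact:
    """A suspected mine reported by a detection drone."""
    lat: float
    lon: float
    confidence: float


def detection_pass(seafloor_returns: list[dict]) -> list[Contact]:
    """Detection group: flag sonar returns that look like mines."""
    return [Contact(r["lat"], r["lon"], r["score"])
            for r in seafloor_returns if r["score"] > 0.8]


def neutralization_tasking(contacts: list[Contact],
                           neutralizers: list[str]) -> dict[str, list[Contact]]:
    """Neutralization group: hand contacts out round-robin to the killer drones."""
    tasking: dict[str, list[Contact]] = {n: [] for n in neutralizers}
    for i, contact in enumerate(contacts):
        tasking[neutralizers[i % len(neutralizers)]].append(contact)
    return tasking


if __name__ == "__main__":
    returns = [{"lat": 36.95, "lon": -76.33, "score": 0.91},
               {"lat": 36.96, "lon": -76.31, "score": 0.42},
               {"lat": 36.97, "lon": -76.30, "score": 0.88}]
    contacts = detection_pass(returns)
    print(neutralization_tasking(contacts, ["neutralizer-1", "neutralizer-2"]))
```

The design point is that neither group needs to understand the whole kill chain; each only needs enough autonomy for its own slice of the job.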
“Underwater vehicles are optimized to do certain things, and it may make more sense to have them individually do parts of the kill chain rather than have one that does every part,” Bradley Martin, a retired Navy captain and co-author of the RAND report, told the National Interest.
Ironically, despite the worries over autonomous robots, RAND argues that autonomy is not the biggest obstacle to using underwater drones. For minesweeping, the most limiting factor is detecting the mines. For surveillance, it’s the endurance and range of the craft, not whether it can function by itself.
“The fact that autonomy is not generally the limiting factor is important in considering effort and investment,” RAND concludes. “Developing a highly autonomous system that lacks important elements of mission capability adds little warfighting value.”
Michael Peck is a contributing writer for the National Interest. He can be found on Twitter and Facebook.
Image: Flickr