Monday, July 22, 2024

Welcome to the Next War: the AI Triple Black Box and Accountability

Professor Ashley Deeks
Greetings from "The Double Black Box: National Security, Artificial Intelligence, and the Struggle for Democratic Accountability" by Professor Ashley Deeks, University of Virginia. This is a public part of the conference "Anticipating the Future of War: AI, Automated Systems, and Resort-to-Force Decision Making", hosted by the Coral Bell School of Asia Pacific Affairs at the Australian National University. Professor Deeks' thesis is that defence AI is a black box twice over: even the programmers don't know exactly what it is doing, and even if they did, the details would be secret. I suggest the situation is not that bad: it is possible to build AI systems which can be asked why they made a decision. But as the recent Microsoft/CrowdStrike failure shows, even non-AI systems can do surprising things. There is also cause for concern, as Professor Deeks pointed out, due to the scale at which such systems would be used.
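
To illustrate the point that an AI system can be built to answer "why", here is a minimal sketch in Python, with entirely hypothetical feature names and training data, of a classifier whose decision path can be read back as a plain-language explanation:

    # A minimal sketch of an "askable" classifier: a decision tree whose
    # decision path can be read back as a human-readable explanation.
    # The feature names and training data are hypothetical placeholders.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    FEATURES = ["speed_m_s", "radar_cross_section_m2", "emits_iff"]

    # Hypothetical labelled examples: 1 = treat as hostile, 0 = do not.
    X = np.array([[300.0, 5.0, 0],
                  [250.0, 4.0, 0],
                  [80.0, 20.0, 1],
                  [60.0, 25.0, 1]])
    y = np.array([1, 1, 0, 0])

    model = DecisionTreeClassifier(max_depth=3).fit(X, y)

    def explain(sample):
        """Return the prediction and the tests it passed on the way."""
        tree = model.tree_
        reasons = []
        for node in model.decision_path([sample]).indices:
            if tree.children_left[node] == tree.children_right[node]:
                continue  # leaf node: no test applied here
            name = FEATURES[tree.feature[node]]
            threshold = tree.threshold[node]
            op = "<=" if sample[tree.feature[node]] <= threshold else ">"
            reasons.append(f"{name} {op} {threshold:.1f}")
        return int(model.predict([sample])[0]), reasons

    label, reasons = explain([280.0, 6.0, 0])
    print("hostile" if label else "not hostile", "because:", "; ".join(reasons))

This does not solve the problem for deep neural networks, but it shows the kind of question-and-answer interface a procurement standard could demand.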

At a practical level it is not that difficult to test whether an AI weapon is at least as reliable as a human operator. Doing so could also improve procedures, by making the decision-making process explicit. There will be pressure to use advanced automated systems, just as there is for current simpler ones, such as mines.
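
As a rough illustration of such a test, the sketch below (all figures hypothetical) compares the error rate of an automated system over a set of test scenarios against an assumed human benchmark error rate, using a one-sided binomial test:

    # A minimal sketch of a reliability comparison: does the automated
    # system make errors less often than a benchmark human error rate?
    # All figures here are hypothetical placeholders.
    from scipy.stats import binomtest

    HUMAN_ERROR_RATE = 0.05   # assumed benchmark from human trials
    system_errors = 31        # errors the AI system made under test
    system_trials = 1000      # scenarios it was tested on

    # Null hypothesis: the system's error rate is at least the human
    # benchmark; a small p-value supports "at least as reliable".
    result = binomtest(system_errors, system_trials,
                       HUMAN_ERROR_RATE, alternative="less")

    print(f"observed error rate: {system_errors / system_trials:.3f}")
    print(f"p-value: {result.pvalue:.4f}")

The hard part is not the statistics, but agreeing on what counts as an error and building a test set that reflects real conditions.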

Professor Deeks is presenting a US-centric view of the issues. However, the US is not a leader in the development of AI weapons. Any country with a university that has a computing school has the capability to make advanced AI weapons. Recently I assessed a university student project for a small autonomous vehicle. It was built for civilian purposes, but one version was tracked, and would only have needed a weapon added to become a robot tank.

The problem, I suggest, could be far harder than Professor Deeks indicates. The secret sauce of an AI weapon is in the software. The physical weapon can be upgraded over the air to gain new capabilities. Some of this has already been seen with missiles, where air-launched missiles have been adapted for surface launch, and surface-launched ones for air launch. An example is the US Navy's SM-6 ship-launched missile, adapted for air launch against surface, air and space targets. Deciding whether something is an anti-satellite weapon or not is a matter of software.
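
Here is a toy sketch, entirely hypothetical, of how such a capability can be purely a software setting: the same engagement code, with one extra entry in its target whitelist, handles a new class of target after an over-the-air update.

    # Hypothetical illustration only: the hardware is unchanged, and the
    # engagement whitelist in software is all that decides the answer.
    from dataclasses import dataclass, field

    @dataclass
    class EngagementConfig:
        allowed_classes: set = field(
            default_factory=lambda: {"surface", "air"})

    def may_engage(target_class: str, config: EngagementConfig) -> bool:
        return target_class in config.allowed_classes

    baseline = EngagementConfig()
    updated = EngagementConfig(allowed_classes={"surface", "air", "space"})

    print(may_engage("space", baseline))  # False before the update
    print(may_engage("space", updated))   # True after the update

This makes it hard for an outside observer, looking only at the hardware, to say what the weapon actually is.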

Professor Deeks mentioned her paper "The Judicial Demand for Explainable Artificial Intelligence" (2019), which argued for lawyers to become AI savvy. Some law firms are already thinking about the technology, such as Herbert Smith Freehills.
