Wednesday, June 28, 2023

Weaponizing ChatGPT

Professor Sarah Kreps, Cornell University
Since late last year I have presented at, and attended, many seminars on ChatGPT, but this is a first: "Weaponizing ChatGPT? National Security and the Perils of AI-Generated Texts in Democratic Societies". In this case "weaponising" is not a metaphor or hyperbole: it really is about using AI to wage war. Greetings from the Australian National University, where a group of defence and AI experts from around the world are meeting, sponsored by the Australian Department of Defence. This is the sort of seminar where not only have the speakers just published new books, but so have half the audience. It is a little intimidating.

Professor Sarah Kreps, Cornell University, explained she was an engineer in the US Air Force, working on the Predator UAV, and AI was a short step from there. She suggested AI starts in the civilian sector and is then adopted by the military. I am not so sure that is the case: what became web search engines came out of research sponsored by the US DoD, and some AI research is similarly sponsored.

From the Professor's description, the US approach seems limited to assessing the threat from an adversary's use of AI. What I suggest western militaries need to do is consider how they will use AI themselves. As an example, if an adversary is using AI to create plausible fake news to undermine your nation, how can AI be used defensively to create instant factual responses, or offensively to create a largely factual narrative to undermine the enemy?

Professor Kreps characterized western countries as open systems which can be exploited by misinformation, and AI can be used to enhance this. The result would be customized fake news, tailored to appeal to specific groups.

It would be interesting to see what the ADF's Information Warfare Division is doing with generative AI. Australia's military cyber-security experts have taken an increasing role in protecting government and civilian systems, reflected in the agency's name change from Defence Signals Directorate to Australian Signals Directorate. Is there a similar role for the IWD?

Professor Kreps suggested the vocabulary used could indicate when generative AI has been used. I suggest that can't be relied on. Professional speechwriters know to use the language and cadence of their clients, and it will not be difficult for AI to write using the language of a particular individual or group, as the sketch below illustrates.
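To show how little effort style mimicry takes, here is a minimal sketch using the OpenAI Python library as it stood in mid-2023 (the 0.27.x ChatCompletion interface). The SAMPLES placeholder, prompt wording and topic are all my own illustrative assumptions, not anything demonstrated at the seminar:

    # Sketch only: mimicking a writer's style with the OpenAI Python
    # library (0.27.x API, mid-2023). Samples and prompt are illustrative.
    import openai

    openai.api_key = "sk-..."  # set your own key

    # A few genuine passages by the person or group to be imitated.
    SAMPLES = """
    [paste two or three short passages by the target writer here]
    """

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Study the vocabulary, cadence and tone of these "
                        "writing samples, then answer in the same style:\n"
                        + SAMPLES},
            {"role": "user",
             "content": "Write a short statement on defence policy."},
        ],
        temperature=0.7,
    )

    # The output adopts the target's vocabulary, defeating any detector
    # that relies on word choice alone.
    print(response.choices[0].message.content)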

This would appear to be an area where DARPA, and its equivalents in allied countries, could provide funding to universities. This could produce free, open-access tools to counter misinformation, along the lines sketched below.
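As one illustration of what such an open tool might look like, the sketch below wraps a freely available detector model (a RoBERTa classifier released to spot GPT-2 output) in a few lines of Python. The Hugging Face transformers library and the model name are real; how the tool is packaged is my assumption, and detectors of this vintage are known to be unreliable on text from newer models, which rather supports the caution above:

    # Sketch of a free detection tool built from open components. The
    # model named here is a real, openly released GPT-2-output detector;
    # it is unreliable on text from newer models such as ChatGPT.
    from transformers import pipeline

    detector = pipeline("text-classification",
                        model="roberta-base-openai-detector")

    text = "Example passage whose provenance we want to check."
    result = detector(text)

    # The model labels text "Real" (human) or "Fake" (machine), with a
    # confidence score.
    print(result)  # e.g. [{'label': 'Real', 'score': 0.92}]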
