Australia’s defense assessment overlooked one of the biggest transformations in warfare: artificial intelligence
Throughout history, war has been irrevocably changed by the advent of new technologies. War historians have identified several technological revolutions.
The first was the invention of gunpowder in ancient China. It gave us muskets, rifles, machine guns and eventually all sorts of explosives. It’s uncontroversial to say that gunpowder completely changed the way we waged war.
Then came the invention of the atomic bomb, making the stakes higher than ever. Wars can be ended with just a single weapon, and life as we know it can be ended with a single nuclear stockpile.
And now war – like so many other aspects of life – has entered the age of automation. AI will cut through the “fog of war” and transform where and how we fight. Small, cheap and increasingly capable unmanned systems will replace large, expensive manned weapons platforms.
We have seen the beginnings of this in Ukraine, where advanced armed homemade drones are being developed, where Russia is deploying AI “smart” mines that explode when they detect footsteps nearby, and where Ukraine successfully used autonomous “drone” boats in a major attack on the Russian Navy in Sevastopol.
We are also seeing this revolution taking place in our own armed forces in Australia. All of this raises the question: why, in its recent defense strategic review, has the government failed to seriously consider the implications of AI-assisted warfare?
AI has crept into the Australian military
Australia already has a range of autonomous weapons and craft that can be deployed in conflict.
Our Air Force expects to purchase a number of 40-foot Ghost Bat unmanned aerial vehicles to ensure our very expensive F-35 fighter jets are not rendered obsolete by advancing technologies.
At sea, the defense force has tested a new type of unmanned surveillance vessel, called the Bluebottle, developed by local company Ocius. And under the sea, Australia is building a six-meter Ghost Shark prototype unmanned submarine.
Defense also plans to develop many more such technologies in the future. The government’s just-announced $3.4 billion defense innovation accelerator will aim to bring advanced military technologies – including hypersonic missiles, directed energy weapons and autonomous vehicles – into early use.
So how do AI and autonomy fit into our larger strategic picture?
The recent defense strategic review is the latest analysis of whether Australia has the necessary defense capability, posture and preparedness to defend its interests over the next decade and beyond. You’d expect AI and autonomy to be a major concern, especially as the review recommends spending a not-inconsiderable A$19 billion over the next four years.
Yet the review only mentions autonomy twice (both times in the context of existing weapons systems) and AI once (as one of the four pillars of the AUKUS submarine program).
Countries are preparing for the third revolution
Major powers around the world have made it clear that they see AI as a central part of the planet’s military future.
In the United Kingdom, the House of Lords is holding a public inquiry into the use of AI in weapon systems. In Luxembourg, the government has just hosted an important conference on autonomous weapons. And China has announced its intention to become the global leader in AI by 2030. Its development plan for a new generation of AI proclaims that “AI is a strategic technology that will lead the future” in both a military and an economic sense.
Likewise, Russian President Vladimir Putin has stated that “whoever becomes the leader in this field will become the ruler of the world”, and the United States has adopted a “third offset strategy” that will invest heavily in AI, autonomy and robotics.
Unless we pay more attention to AI in our military strategy, we risk waging wars with outdated technologies. Russia saw the painful consequences of this last year when its missile cruiser Moskva, the flagship of the Black Sea Fleet, was sunk after reportedly being distracted by a drone.
Future regulations
Many people (myself included) hope that autonomous weapons will be regulated soon. I was invited as an expert witness to an intergovernmental meeting in Costa Rica earlier this year, where 30 Latin American and Caribbean countries called for regulation – many for the first time.
Regulations will hopefully ensure that meaningful human control is maintained over autonomous weapon systems (although we haven’t yet agreed on what “meaningful control” will look like).
But regulation will not make AI disappear. We can still expect AI and some levels of autonomy to be essential components in our defenses for the foreseeable future.
There are instances, such as minefield clearing, where autonomy is highly desirable. Indeed, AI will be very useful in managing the information space and in military logistics, where its use will not face the ethical challenges encountered in other settings, such as the use of lethal autonomous weapons.
At the same time, autonomy creates strategic challenges. By reducing costs and enabling scale, for example, it will change the geopolitical order: Turkey has become a major drone superpower.
We must prepare
Australia needs to think about how to defend itself in an AI-assisted world, where terrorists or rogue states could launch swarms of drones against us – and where it might be impossible to determine the attacker. A review that ignores all this leaves us woefully unprepared for the future.
We also need to participate more constructively in ongoing diplomatic discussions about the use of AI in warfare. Sometimes the best defense is in the political arena, not the military one.
This article has been republished from The Conversation under a Creative Commons license. Read the original article.