Future Warfare

Pulse
6 June 2022

‘The Fourth Industrial Revolution’, writes Bernard Marr in an article published on the Forbes website, ‘describes exponential changes to the way we live, work and relate to one another due to the adoption of cyber-physical systems, the Internet of Things and the Internet of Systems.’ ‘This revolution’, he adds, ‘is expected to impact all disciplines, industries, and economies.’

This explosion of technological innovations and increase in connectivity will not only change the way in which we live, work and play in profound ways. It will also change the way in which wars are fought – it will radically impact and shape future warfare, the scale of conflict as well as its character.

The ramifications of these developments will be so far-reaching that national and international bodies will scramble to come up with protocols to prevent their abuse that could result in the wanton loss of innocent lives. In short, in the wake of the Fourth Industrial Revolution, future warfare will push ethics to venture into new and unexplored frontiers.

Doubtless, one of the most serious threats of our time is a war that is not fought on traditional physical terrains but in cyberspace. Malicious actors could, through the use of technology, bring their enemies to their knees by disrupting, confusing or destroying their sensors, communications and decision-making.

Cyber warfare will prove to be a game-changer. It will not only lower the threshold of war, but will also blur the distinction between war and peace. In addition, in a cyber war the perpetrators – faceless hackers, terrorists, activists, criminals, etc. – are often elusive, and therefore difficult to identify, much less monitor.

Cyber warfare also poses immense and unique challenges to the international community with respect to the regulation of the use of technology.

Scholars like Patrick Lin have observed that ‘International humanitarian laws, or the “Laws of war”, were not written with cyberspace in mind.’ In fact, as Klaus Schwab has perceptively pointed out, ‘We lack even a taxonomy to agree on what amounts to an attack and the appropriate response, with what and by whom.’

All this has prompted some ethicists to ask if it is indeed possible to wage a just cyber war.

For example, the traditional laws of war recognise only one ‘just cause’ for war, namely, defence against aggression, defined as acts that have the potential to put human lives in jeopardy. However, as Lin has pointed out, ‘If aggression in cyberspace is not tied to actual physical harm or threat of lives, it is unclear then how we should understand it.’

Future warfare will also increasingly involve the use of military robots, powered by AI. These ‘robo-wars’, as some writers have dubbed them, will be fought mostly by autonomous machines capable of identifying and obliterating targets without human instruction or intervention.

Because of the autonomous nature of these machines, ‘collateral damage’ in the form of civilian deaths due to malfunction will pose serious ethical and legal problems.

Who is to be held responsible? The manufacturer? The programmer? The army? The country that unleashed these killing machines and whose mission they are carrying out? Or, perhaps the robots themselves should be held accountable, since they are supposedly autonomous?

It is still debated whether the traditional ‘logic of responsibility’, expressed in the maxims ‘he who acts through another does the act himself’ (qui facit per alium facit per se) and ‘let the master answer’ (respondeat superior), remains relevant and applicable to robo-ethics, especially where autonomous killing machines are involved.

Another issue that theologians, philosophers, ethicists and policymakers have to grapple with in the Fourth Industrial Revolution is dual-use technologies. The European Commission (EC) defines dual-use goods as ‘items, including software and technology, which can be used for both civil and military purposes.’

For example, advances in neuroscience and technology have made possible brain-computer interfaces that enable patients suffering from paralysis to control a robotic arm. But the same technology can also be used to create a bionic soldier capable of performing tasks that an ordinary soldier cannot. Similarly, neurological devices used to manage Alzheimer’s patients can be modified to enhance the mental prowess of soldiers or even erase their memories.

To be sure, this issue is not new. But with the proliferation of more advanced and innovative technologies, questions concerning their proper governance will become infinitely more complex.

Writers have also speculated that the seabed and space will be increasingly militarised as more and more state and commercial actors deploy satellites and unmanned underwater vehicles capable of disrupting satellite traffic and fibre-optic cables.

As Schwab notes, ‘While more than half of all satellites are commercial, these orbiting communications devices are increasingly important for military purposes.’ He alludes to a new generation of ‘glide’ weapons whose deployment would increase ‘the probability that space will play a role in future conflicts’, raising concern ‘that current mechanisms to regulate space activities are no longer sufficient.’

More can be said about the challenges that the military use of technologies such as nanotechnology, synthetic biology and 3-D printing (the list can easily be expanded) presents to the international community.

It is clear that no single country can impede the advancement of these technologies or foreclose their use for military purposes. This can be achieved, even in part, only when countries work together to establish common ethical guidelines and protocols, and impose prohibitions in the form of international treaties.

But, as many theologians, philosophers and ethicists have repeatedly pointed out, it is also clear that a more concerted effort must be made to work out the profound ethical ramifications of these new technologies.

For the sheer speed and multifaceted impact of technological advances that we are witnessing in the Fourth Industrial Revolution have more often than not left ethics lagging woefully behind.


Dr Roland Chia is Chew Hock Hin Professor at Trinity Theological College (Singapore) and Theological and Research Advisor of the Ethos Institute for Public Christianity.