May 21, 2022 (updated 27 Jun 2022, 4:40am)

Balthasar Staehelin

New technologies risk making warfare deadlier for civilians

Humanitarian organisations need smarter ways to prevent more people getting caught in the crossfire.

Today’s wars are not waged solely through kinetic operations on the battlefield. They are fought across multiple domains that see sophisticated new technologies harnessed alongside more traditional munitions. The Ukraine conflict is just the latest example.

The impact of new technologies on conflict, humanitarian action and international humanitarian law (IHL) is of growing importance to the International Committee of the Red Cross (ICRC). New technology is not only changing the means and methods of warfare, but also the ways in which humanitarian actors respond.

Of the many issues and concerns that new technologies raise for civilians in conflict, four areas deserve particular attention: data protection; misinformation, disinformation and hate speech; cyber warfare; and autonomous weapon systems.

An elderly woman in a wheelchair is evacuated from Irpin, Ukraine. Russia’s invasion of Ukraine has shown how new technology has increased the chances of civilians being caught in the crossfire. (Photo by ARIS MESSINIS/AFP via Getty Images)

Cybersecurity for civilians as new technologies make warfare deadlier

Data is often referred to as the new oil, but it could just as easily be compared to asbestos. As humanitarians, we collect data from incredibly vulnerable people – refugees, prisoners of war, detainees, people at risk of persecution, to name but a few. While collection and management of this personal data can make us more effective in getting help to the individuals concerned, it can be a matter of life and death if it falls into the wrong hands. As such, we have to be extremely careful about minimising the data we collect and protecting what information we hold. 

The ICRC operates in some of the most dangerous contexts in the world; consequently, we always seek to build trust and respect with combatants so they do not attack us. We know full well, however, that there is no guarantee against attack. Unfortunately, the same situation applies online. Earlier this year we discovered a sophisticated cyberattack against ICRC servers hosting data belonging to more than 515,000 people worldwide. Of course, there are lessons to be learned in terms of improving our cyber defences, but we acknowledge that military-grade cybersecurity cannot be a realistic ambition for humanitarian organisations. Perhaps it is also time for the ICRC to start engaging with relevant groups carrying out cyberattacks, to build the same trust and respect for our purely humanitarian mission as we would with armed combatants.

In recent years, numerous cyber incidents have affected civilian infrastructure. Often, these incidents occur in contexts of political tension or armed conflict. Cyber operations that disrupt medical facilities or interrupt energy and water supplies pose a significant risk to civilian populations. Our view is clear: cyber tools must be designed and deployed in compliance with IHL. In other words, cyberattacks should not be directed against civilian infrastructure, in the same way that hospitals or power plants should not be bombed.

Data and software, including artificial intelligence and machine learning tools, offer potential benefits for humanitarians. They could, for example, be used to help reunite families by analysing large amounts of data, or to inform the design and delivery of humanitarian interventions. At the same time, AI and machine learning can also be used to automate decisions on who or what will be attacked and when.


Let’s be clear. Autonomous weapon systems are not a work of science fiction from a distant dystopian future. They are an immediate cause of humanitarian, legal and ethical concern and need to be addressed now.

For the ICRC, autonomous weapons are those that select and apply force to targets without human intervention. After initial activation by a person, an autonomous weapon triggers a strike in response to information from the environment – received via sensors and software – and on the basis of a generalised ‘target profile’. The user does not choose, or even know, the specific target, nor the precise timing or location of the strike. The lack of human control and judgement – and the difficulty of anticipating and controlling the effects that result – is at the heart of our concerns. The ICRC is recommending that states adopt new legally binding rules to prohibit unpredictable autonomous weapons and those that apply force to people directly, and to establish strict constraints on all others.

Propaganda wars

The final main digital risk facing civilians in conflict zones is misinformation, disinformation and hate speech (MDH). With the ubiquity of digital technologies and communication systems in humanitarian settings, the ‘fog of information’ is accelerating and exacerbating the ‘fog of war’. This brings with it new layers of complexity, uncertainty and risks for populations and communities affected by conflict and violence.

MDH has always loomed large in conflict zones, but the digitalisation of societies and information ecosystems has introduced new paradigms of scale, speed and reach. While a lot of attention is being paid to how MDH affects democracy and public health, there’s an urgent need to focus on conflict areas, where risks are high and safeguards and resilience mechanisms are weak.

In warzones, what is said online can have damaging repercussions in the real world, not only for civilians, but for those trying to protect them. Indeed, the ICRC has found itself the subject of misinformation and disinformation in the Ukraine conflict. It is not the first time we have been targeted and I am sure it will not be the last. But every time it happens, it puts Red Cross staff, volunteers and the people we seek to help at risk. 

We have to learn to better manage how MDH affects our operations, as well as the civilians caught at the epicentre of conflict. Indeed, all humanitarians and armed actors need to think carefully about the true impact of new technology in conflict zones. We all have responsibilities. Those waging war need to assess the risks to civilians and ensure full compliance with international law. Humanitarian organisations, meanwhile, need to use technology responsibly and make sure we do no harm. If the ICRC wants to continue to be useful on the battlefield, then we will have to continue adjusting and evolving as technology advances. Lives depend on it.

