The “fog of war” is a concept credited to the Prussian general Carl von Clausewitz to describe the sense of uncertainty during <a href="https://www.thenationalnews.com/opinion/comment/2024/06/28/syria-isis-al-hol-terrorism/" target="_blank">armed conflicts</a> in the 1800s. Two centuries later, that fog has grown thicker, as people in today’s wars must wade through a torrent of <a href="https://www.thenationalnews.com/mena/palestine-israel/2023/10/11/uk-calls-for-urgent-meeting-with-social-media-companies-to-tackle-gaza-israel-fake-news/" target="_blank">disinformation</a> just when they most need reliable information to survive.

“Is the rumour about an imminent <a href="https://www.thenationalnews.com/world/us-news/2024/02/23/us-sanctions-500-russian-targets-on-second-anniversary-of-ukraine-war/" target="_blank">invasion</a> true?” “What is the safest evacuation route?” “Can we trust the opposition force’s WhatsApp group?” “Why is our community leader saying the aid group’s food is contaminated?”

When you need to know the safest escape route from an outbreak of fighting, uncertainty and <a href="https://www.thenationalnews.com/business/technology/2023/10/12/tiktok-warned-over-disinformation-following-hamas-attack-on-israel/" target="_blank">false information</a> can lead to death, injury, imprisonment, discrimination or displacement. Moreover, the lies contained in <a href="https://www.thenationalnews.com/opinion/editorial/2023/06/16/an-unprecedented-un-resolution-tackles-the-link-between-extremism-and-violence/" target="_blank">hate-filled narratives</a> can fuel vicious cycles of violence and further entrench already protracted conflicts. Disinformation and escalatory narratives go hand in hand to <a href="https://www.thenationalnews.com/world/us-news/2023/06/14/un-adopts-resolution-acknowledging-effect-of-extremism-and-hate-speech-on-global-peace/" target="_blank">fuel hatred</a> and to dehumanise individuals or groups.
A distorted information environment may also influence the behaviour of arms bearers by undermining their respect for international legal and protective frameworks. In fact, <a href="https://www.thenationalnews.com/tags/un/" target="_blank">UN Secretary General</a> <a href="https://www.thenationalnews.com/news/us/2024/07/17/humanitarian-situation-in-gaza-a-moral-stain-on-us-all-guterres-says/" target="_blank">Antonio Guterres</a> last month highlighted the issue with a new report that found that, even as technological advances unleash new opportunities at previously unthinkable scale, those advances have also facilitated the spread of misinformation and hate speech “at historically unprecedented volume, velocity and virality, risking the integrity of the information ecosystem”.

The new UN Global Principles for Information Integrity call for innovative digital trust and safety practices, particularly ones that reflect the needs of groups in situations of vulnerability and marginalisation. They offer concrete recommendations to technology companies, the media, AI actors, advertisers, states and civil society for building digital ecosystems that enable everyone to navigate information spaces safely. These are sorely needed advances, especially for people trapped in conflict.

There is growing evidence that warring parties use social media and other online avenues to spread co-ordinated and targeted disinformation and hate speech. It is a constantly evolving risk. The use of <a href="https://www.thenationalnews.com/tags/artificial-intelligence/" target="_blank">artificial intelligence</a> and machine learning has untold potential for the propagation of believable but fabricated, misleading or harmful information. Machine-generated text, images, videos and deepfakes increase the ease and speed at which harmful content can be created and spread.
Although disinformation is increasingly debated by governments, academics and international organisations, its impact on populations affected by armed conflict needs to be made more visible. That is why I was pleased to speak in May at a G20 side event in Brazil on promoting information integrity and tackling disinformation, hate speech and online threats to public institutions, where I urged authorities to prevent these harms to the safety and dignity of populations affected by armed conflict.

The <a href="https://www.thenationalnews.com/mena/arab-showcase/2024/07/04/robert-mardini-international-committee-red-cross-arab-showcase/" target="_blank">ICRC</a>, a 160-year-old organisation whose core mandate is to alleviate suffering in armed conflict, is constantly adapting its work to new realities of war. Between 2021 and 2023, the ICRC convened an advisory board of high-level legal, military, policy, technological and security experts to advise the organisation on digital threats. Last year, this board published a report with recommendations to belligerents, states, tech companies and humanitarian organisations on preventing and mitigating these threats.

Other aid groups are also working to respond to disinformation. Recently, the ICRC hosted several international organisations, including Doctors Without Borders (MSF) and the UN High Commissioner for Refugees, to address these challenges. The goal is to create a shared response framework to guide humanitarian organisations on how best to respond to harmful information. No one organisation can do this alone; we must all work together to reduce these risks for vulnerable people.

Some guidance for these efforts can be found in international humanitarian law, also known as “the law of war”. This is a set of rules that protects people who are not, or are no longer, participating in hostilities and restricts the means and methods of warfare.
International humanitarian law lays down prohibitions that apply to the use of technology, including social media and other platforms, to incite attacks against civilians, civilian objects, or wounded, sick or detained enemy soldiers. It also prohibits carrying out acts or threats of violence to spread terror among the civilian population. Examples would include hacking into communication networks to propagate false air-raid alarms, or spreading disinformation that purposefully obstructs the operations of humanitarian organisations.

We may never be able to entirely eliminate the fog of disinformation during armed conflicts, but the harm it causes can be prevented or mitigated. More effort is needed – immediately – from governments, traditional media outlets, social media platforms and civil society actors to improve the distribution of reliable information and reduce disinformation as much as possible. Stakeholders need to ensure that the specific vulnerabilities and requirements of people in conflict settings are adequately considered and addressed.

More can, and should, be done to ensure that information ecosystems, offline and online, do not become vectors of threats and insecurity for vulnerable people. The international community needs to do better to prevent and mitigate this negative spiral of violence and insecurity.