Disinformation is information that’s intended to mislead – whether for profit, harm, or to advance political and ideological objectives. The misuse of information is nothing new, but thanks to social media, its scale and reach came to the public’s attention in 2016 with reports of active disinformation campaigns aimed at the U.S. elections and the Brexit referendum.
Today, disinformation continues to grow around the world, harnessing social media and the generative ability of artificial intelligence (AI) to easily create images and videos. These erode public trust in government, institutions, and the media, with direct and indirect impacts on USAID programs around the world.
For example, in 2020, COVID-19 disinformation made it much more difficult for global institutions, local governments, and health practitioners to combat the global pandemic. In West Africa, Russian disinformation leveraged colonial legacies to erode public trust in democratic institutions, contributing to destabilising conditions and a wave of military coups in recent years. In Eastern Europe, Palladium is working to counter disinformation campaigns against Ukrainian refugees in hosting communities.
The increasing polarisation we are witnessing across many areas of the world – whether within the frame of great power competition between the U.S. and China or the geopolitical conflict between the West and Russia – coupled with increasingly easy ways to create and disseminate information, means that we are often working in contested spaces where information and narratives have the power to shape development outcomes. These outcomes are varied and can include everything from vaccine uptake to climate change action to support for democratic governance worldwide.
It's a complex issue that can impact people around the world, but disinformation programming often focuses on a narrow subset of the problem (the individual consumer of information) and a particular mitigation method called “pre-bunking”.
The Concept of Pre-Bunking
Pre-bunking is an approach that draws parallels from public health. The idea is to expose people or targeted populations to a weakened version of disinformation (the virus) to enable the development or strengthening of critical thinking skills (the antibodies of disinformation) to become more resilient to its effects.
At its heart, pre-bunking is about showing people the tactics of misinformation before they actually encounter it, so that they’re better equipped to recognise it and resist when they do come across it.
While pre-bunking can be effective to a degree, it’s also insufficient because it does not address the very human aspects of disinformation: how people’s desires, experiences, and identities are leveraged, distorted, or manipulated, and the social nature of information ecosystems, in which the ways people consume, contribute, and engage with information shape their interpretations.
Rather, pre-bunking is a tactic that relies on an individual’s ability to parse out truth from fiction—an increasingly challenging task due to evolutions in generative AI. Research has also demonstrated that people are selective in their consumption and interpretation of information; we’re drawn to information that confirms our viewpoints, contributing to self-selected information silos.
Drawing another parallel from public health, inoculations are effective when the makeup of the virus remains the same. Certain vaccines must be updated periodically to protect against mutations or to shore up protection that can fade over time. Likewise, we must acknowledge that disinformation is not static, but evolves constantly and rapidly to maintain relevancy and to capture attention.
A Systems Approach: The Power of Collective Sense-Making
Beyond mitigation measures such as pre- and de-bunking, we need to maintain active awareness of disinformation challenges in the communities and contexts in which our teams work, and support those societies and communities to better address or be resilient to its impacts.
This requires broadening the focus of disinformation programming to the collective, because disinformation spreads through – but can also be countered by – systems and networks of social connection.
Our consumption and interpretation of information is part of a collective sense-making process involving dialogue and socialisation that enables communities to engage, make meaning of their information environments, and find solutions together. While this may sound relatively simple, it has become increasingly challenging as disinformation fuels and is fuelled by increasingly divergent interpretations (for example, between climate “alarmists” and “deniers,” or pro- and anti-vaccination camps) that wear away the common ground in societies for positive action and change. For some bad actors in the disinformation space, eliminating common ground is precisely the point: it weakens social and civic discourse and consensus while sowing mistrust.
To combat disinformation, we need a systems approach to understanding and addressing its particular challenges in various contexts. Active network analysis, political economy analysis, and mapping of information actors and systems by development practitioners can form the basis of interventions that can support all stakeholders (communities, government, civil society, and the media) to identify disinformation challenges and create or strengthen ecosystems of resiliency.
For more information, contact email@example.com.