Misinformation is often the symptom, not the disease
Misinformation, disinformation, propaganda, conspiracy theories, distrust, and more.
[Note: this essay was originally published in December 2023 at The Institute of Art and Ideas’ website: https://iai.tv/articles/misinformation-is-the-symptom-not-the-disease-daniel-walliams-auid-2690]
Since the United Kingdom’s Brexit vote and the election of Donald Trump in 2016, we have been living through an unprecedented societal panic about misinformation. Poll after poll demonstrates that the general public is highly fearful of fake news and misleading content, a concern which is widely shared among academics, journalists, and policymakers.
This panic is driven by the narrative that misinformation is a kind of societal disease. Sometimes this metaphor is explicit, as with the World Health Organisation’s claim that we are living through an “infodemic” and influential research that likens misinformation to a contagious virus, but it also motivates the common diagnosis that misinformation lies at the root of many societal ills. On this analysis, ordinary individuals are routinely sucked into online rabbit holes that transform them into rabid conspiracy theorists, and misinformation is the driving force behind everything from vaccine scepticism to support for right-wing demagogues.
The disease model of misinformation has practical consequences. If misinformation is a societal disease, it should be possible to cure societies of various problems by eradicating it. The result is intense efforts among policymakers and companies to censor misinformation and reduce its visibility, as well as numerous initiatives that aim to cure citizens of false beliefs and reduce their “susceptibility” to misinformation.
Is the disease narrative correct? In some cases, exposure to misinformation manifestly does have harmful consequences. Powerful individuals and interest groups often propagate false and misleading messages, and such efforts are sometimes partly successful. Moreover, evidence consistently shows that the highly biased reporting of influential partisan outlets such as Fox News has real-world impact.
Nevertheless, the model of misinformation as a societal disease often gets things backwards. In many cases, false or misleading information is better viewed as a symptom of societal pathologies such as institutional distrust, political sectarianism, and anti-establishment worldviews. When that is true, censorship and other interventions designed to debunk or prebunk misinformation are unlikely to be very effective and might even exacerbate the problems they aim to address.
To begin with, the central intuition driving the modern misinformation panic is that people—specifically other people—are gullible and hence easily infected by bad ideas. This intuition is wrong. A large body of scientific research demonstrates that people possess sophisticated cognitive mechanisms of epistemic vigilance with which they evaluate information.
If anything, these mechanisms make people pig-headed, not credulous, predisposing them to reject information at odds with their pre-existing beliefs. Undervaluing other people’s opinions, they cling to their own perspective on the world and often dismiss the claims advanced by others. Persuasion is therefore extremely difficult and even intense propaganda campaigns and advertising efforts routinely have minimal effects.
To many commentators, these findings are difficult to accept. If people are not gullible and persuasion is difficult, what explains the prevalence of extraordinary popular delusions and bizarre conspiracy theories? The question, however, rests on a widespread but confused assumption: that the truth is always self-evident and desirable, such that false beliefs can only be explained by the credulous acceptance of misinformation.
First, the truth about complex and often distant states of affairs is not self-evident. In forming beliefs, citizens rely on interpretive dispositions and intuitions that are not well-aligned with truth or contemporary scientific consensus. Indeed, we need science and expertise precisely because the truth is often highly counter-intuitive. When it comes to topics as diverse as vaccines, nuclear power, GMOs, and the nature of complex, modern societies, most of us therefore start out with pre-scientific intuitions. For example, many people’s intuitive sense of disgust is activated at the thought of being injected with (what they imagine to be) a live disease, and a deeply entrenched omission bias causes people to fear the consequences of being vaccinated (an act of commission) more than the consequences of not being vaccinated (an act of omission).
To overcome such intuitions, people must encounter and accept reliable information. Of course, defining what constitutes reliable information is as challenging as defining misinformation. Fallibility, bias, and error are ineliminable features of the human condition, including within our leading epistemic institutions. Nevertheless, precisely because modern science implements procedures designed to overcome human frailties and biases, such as peer review, open debate, and distinctive social norms, consensus views among diverse experts tend to be broadly reliable. Similarly, even their critics acknowledge that mainstream media outlets in democratic societies that adhere to norms of journalistic objectivity (e.g., fact-checking, balance, and accountability) tend to be mostly reliable when it comes to reporting on narrow matters of fact.
Unfortunately, not only do most citizens not pay much attention to politics or the news, but a minority actively distrust institutions such as modern science, public health authorities, and mainstream media. The causes of this distrust are complex and diverse. They include psychological traits that predispose some people towards paranoid worldviews; institutional failures, such as telling noble lies to manage public behaviour, and dismissing legitimate ideas as conspiracy theories; and feelings—often justified—of exclusion from positions of power and influence. Whatever its causes, however, such distrust often drives people to seek out information—commonly misinformation—from counter-establishment sources and reject information from mainstream ones.
Second, the truth is not always desirable, nor easy to accept. When it comes to domains such as politics and culture, human beings are not disinterested truth seekers. The competing sides in political debates and culture wars often behave more like warring religious sects than groups organised around coherent worldviews. Their members embrace beliefs and narratives that signal their tribal allegiances, cast their own group in a favourable light, and derogate their rivals and enemies. Similarly, just as those in power often seek to embrace worldviews that affirm and rationalise their superiority, members of the general public who despise “elites” and the “establishment” are often eager to embrace narratives that demonise them, sometimes in the most extreme way possible (e.g., by casting them as Satanic paedophiles).
These motivations to embrace biased beliefs cause people to seek out belief-justifying information. The result is a marketplace of rationalisations that rewards the production and dissemination of content that supports favoured narratives in society. We tend to view the super-spreaders of misinformation as master manipulators, orchestrating mass delusion from their keyboards and podcast appearances, but they are often better understood as entrepreneurs who use their rhetorical skills to affirm and justify in-demand beliefs in exchange for social and financial rewards. More than merchants of doubt, they are merchants of affirmation, and for the right price they will validate and rationalise anything.
The importance of factors such as institutional distrust, polarisation, and rationalisation markets implies a very different picture of misinformation, one in which it looks less like a disease than a mirror reflecting deeper societal pathologies. As the legal scholar Dan Kahan puts it, on this picture “misinformation is not something that happens to the mass public but rather something its members are complicit in producing.”
There is considerable evidence for this analysis. First, although experiments show that people can sometimes be persuaded to abandon false beliefs, such interventions rarely change more basic attitudes, such as voting or vaccination intentions. This suggests that consuming misinformation often serves to rationalise pre-existing inclinations rather than cause them.
Second, as with political media generally, misinformation largely preaches to the choir. For example, people consume and spread political misinformation that supports their favoured groups and causes, and members of online conspiratorial communities are not a cross-section of the population but people with specific motivations, identities, and predispositions. The consumers of misinformation are therefore rarely passive victims of false information; they actively seek out and engage with biased content and often define their identity by commitment to certain beliefs and worldviews.
Third, studies suggest that the most important predictors of misinformation consumption and belief are factors such as polarisation, institutional distrust, and the level of governmental corruption. For example, conspiracy theorising thrives in countries with higher levels of corruption (and hence actual conspiracies) and is less common in countries with more inclusive, democratic, and transparent forms of governance. Further, misinformation flourishes in contexts in which polarised groups strongly dislike each other, whether they are Republican and Democrat voters in the USA or citizens whose countries are embroiled in military conflict. Indeed, experts on misinformation agree that partisanship and identity are key drivers of misinformation, whereas lack of access to reliable information plays a negligible role.
Finally, research exploring what happens when the supply of misinformation is restricted suggests that such restrictions often do not reduce people’s overall engagement with it. For example, when the supply of misinformation on Facebook fell during an accidental outage, people simply searched for misinformation elsewhere, and Meta’s intense efforts to censor anti-vaccine content during the Covid-19 pandemic did not reduce engagement with such content on the platform, precisely because people circumvented the censorship and sought it out in other ways.
Where this analysis of misinformation as a symptom rather than a disease is correct, many existing attempts to address the problems associated with misinformation are likely to prove ineffective. Most obviously, if widespread false beliefs are symptomatic of deeper social pathologies, we should not expect to cure them simply by censoring misinformation, and weaker interventions involving debunking or prebunking false ideas seem unlikely to change much real-world behaviour.
Worse, there is a real risk that censorship exacerbates the very problems it aims to address. For those who already distrust institutions, using those institutions to ban dissenting narratives seems likely to aggravate their distrust. Censorship is a flagrant display of power and disrespect towards those whose views are censored, and for the conspiratorially minded it is exactly what ‘they’ (e.g., elites, the establishment, and so on) would do to prevent people from finding the truth. Moreover, from its roots in the backlash to Brexit and the 2016 election of Trump, the current panic surrounding misinformation is undeniably partisan. Insofar as supporters of right-wing populist movements feel selectively targeted by censorship, as many clearly do, this seems more likely to inflame polarisation than reduce it.
What, then, might help? First, rather than pouring so much energy and money into preventing the public from being infected with false information, it is far more important to rebuild trust in institutions. To achieve this, the most important thing that policymakers and politicians can do is make these institutions more trustworthy. Distrust is not always irrational. For many, it arises from legitimate grievances. As the sociologist Musa al-Gharbi documents, for example, during the Covid-19 pandemic authorities in America often behaved appallingly: messaging was frequently inconsistent, deceptive, partisan, and hypocritical; financial interests were bound up with government decisions in suspicious ways; and the costs of pandemic policies were disproportionately borne by the most disadvantaged and vulnerable.
More broadly, policies should aim at addressing the social conditions that make people avid consumers of misinformation. Intense polarisation in society often drives citizens to view the world as a cartoonish battle between Good and Evil, and steep inequalities of power and status ensure a widespread demand for hyperbolic narratives that demonise elites and the establishment. It is far more difficult to address such issues than to implement simple interventions such as censorship and fact-checking, but if we are serious about improving the epistemic health of society, we must.
What would an epistemically healthy society look like? Institutions in government, science, and the media would be diverse, inclusive, transparent, and accountable. Citizens would feel that they have a stake in public decision making and have the resources and opportunities to participate in it. Experts and policymakers would engage in honest communication, acknowledging uncertainty and treating the public as rational agents capable of handling hard truths, not panic-prone populations to be managed with strategic messaging. And perhaps most importantly, there would be enough social trust that everyone, citizens and policymakers alike, would genuinely try to understand why others disagree with them instead of dismissing other views as the result of brute irrationality or brainwashing.