Strategic paranoia: When irrational fears serve rational ends
The role of strategic paranoia in witchcraft accusations, conspiracy theories, the Satanic panic, tribal narratives, the mafia, and sexual jealousy.
In my last essay, I explored why paranoia is such a common failure mode of human minds. My answer focused on the fragility of “social vigilance”, the psychological processes we evolved to detect and respond to the most complex and dangerous threat we confront: other people.
Unlike other threats (e.g., physical accidents, diseases, and predators), people are rational agents with sophisticated planning, coordination, and deception abilities. They can conceal their intentions, anticipate our responses, and collaborate to plan and orchestrate elaborate schemes to control, manipulate, or even exterminate us.
To manage such threats, social vigilance drives us to infer threatening intentions from subtle and ambiguous clues and ruminate on a wide range of threat-based possibilities. Because such processes are inherently complex and fragile, they often generate false alarms and occasionally spiral into paranoid delusions, especially when we experience psychological disorders and feelings of vulnerability.
The utility of paranoia
This analysis treats paranoia as a maladaptive mistake, an outcome that is difficult but desirable to avoid. However, that is only half the story. Although an exaggerated fear of others can be a cognitive error, it can also be a strategy.
In many cases, paranoia is not a trap people fall into but a weapon they wield to advance their interests. This phenomenon occurs in witchcraft accusations, conspiracy theories, tribal narratives, moral panics, criminal organisations, sexual jealousy, and many other cases. As with many irrational beliefs, it is rooted in strategic, social motives of propaganda and signalling.
The social functions of irrationality
Highly irrational beliefs are sometimes best understood as mistakes. A classic example is the bizarre delusions associated with psychotic disorders such as schizophrenia. Although we still do not fully understand how such delusions arise, there is nothing fundamentally puzzling about why they occur in these contexts. If parts of the mind “break down” or become disordered, it is unsurprising people adopt strange beliefs.
Things are different when healthy (“neurotypical”) people embrace such beliefs. If people are not suffering from any psychological dysfunction or disorder, why would they embrace weird ideas? What explains extraordinary popular delusions, the kind endorsed not by the mentally ill but by otherwise ordinary and even flourishing individuals?
To be clear, nothing is puzzling about why people believe false things. Reality is complex. The truth is not self-evident. Information is so limited and uncertain in many domains that even perfectly rational agents are likely to form inaccurate beliefs. Instead, the puzzle is why psychologically healthy people embrace highly irrational beliefs—beliefs that are not just false but seemingly absurd.
Consider, for example, the belief that members of one’s community are demonic witches plotting against others in orgiastic nighttime gatherings, or that Satanic cults are snatching children from daycare centres to involve them in elaborate rituals, or that devil-worshipping paedophiles secretly run the deep state and Hollywood.
Such beliefs are not simply mistaken. They seem mad—and yet they are professed by manifestly non-mad people.
What is going on?
Motivated irrationality?
One classic answer to this puzzle appeals to self-deception and, more broadly, motivated irrationality. Here, the idea is that people sometimes embrace weird beliefs because doing so promotes their goals. That is, they believe what they want to believe rather than paying attention to the evidence.
Self-deception is widespread and highly consequential in shaping human beliefs. As Adam Smith observed, “This self-deceit, this fatal weakness of mankind, is the source of half the disorders of human life.”
Traditionally, however, people have understood self-deception through a “you can’t handle the truth!” model. They assume that self-deception is about convincing ourselves of things we want to be true. It is a kind of wishful thinking, a process in which we confuse our beliefs with our desires.
Among the many problems with this analysis is that it cannot explain the irrational, paranoid beliefs involved in witchcraft accusations, conspiracy theories, moral panics, and so on. Nobody wants it to be true that witches are plotting against them or that sinister and secretive Satanic cults threaten their children.
The problem also applies to tribal narratives throughout human history in which members of different groups (ethnicities, sects, religions, nations, etc.) endorse paranoid beliefs about the threat posed by other groups. The Nazis did not want it to be true that Jews were a sinister and powerful force in the world, and yet they were fanatically committed to this paranoid fantasy.
One can see a similar phenomenon in many everyday behaviours. Consider, for example, the paranoid husband constantly leaping to irrational suspicions about his wife’s infidelity. In such cases of “twisted self-deception”, people seem highly motivated to consider and embrace beliefs they do not want to be true—that they find distressing, alarming, and even nightmarish.
Again: What is going on?
The social roots of self-deception
As with many apparent forms of irrationality, the roots of extraordinary popular delusions lie in humanity’s social nature.
Humans are intensely social animals. The most important factor shaping human evolution was, therefore, other humans. Most of our central goals are social (status, belonging, trust, mating, relationships, etc.), and even our ability to achieve non-social goals has always depended on extensive social support.
Given this, the human brain is highly specialised for achieving a wide range of complex social outcomes. This is true of the processes that underlie our beliefs. Although most beliefs simply constitute our mental model of reality, some are socially adaptive. They are shaped and distorted by social goals that conflict with the pursuit of accuracy.
Instinctive propagandists
One such goal is propaganda.
When people think of propaganda, they imagine deliberate, top-down efforts by elites and organisations to shape public opinion. However, humans are instinctive propagandists. Because we benefit from influencing what others think, much of human cognition is devoted to persuasion and advocacy. We instinctively communicate in self-serving and self-aggrandising ways, sharing and omitting information in ways designed to boost our status, win social approval and trust, and discredit and demonise our rivals and enemies.
Moreover, we tend to internalise this self-serving propaganda, partly because doing so makes us more effective propagandists. For this reason, instinctive propaganda rarely feels like propaganda. Because we sincerely embrace the ideas that advance our interests, sharing them with others feels like honesty, not manipulation.
Instinctive impression managers
Another social goal that distorts beliefs is social signalling.
Humans are obsessive impression managers. Because our survival and success depend on what others think of us, we invest enormous energy into influencing such impressions. This social signalling drives many puzzling human behaviours, from wasting time and resources to flaunt our status and wealth to participating in elaborate, time-consuming rituals to advertise our allegiance to religious communities.
It also drives many puzzling human beliefs.
As Robin Hanson has observed, many beliefs are like clothes. We adopt them to display our attractive qualities and allegiances, not to determine what is true.
Consider, for example, the belief that a powerful supernatural agent (e.g., God) or force (e.g., karma) punishes immoral behaviour. Although there is no compelling evidence for these supernatural enforcers, the belief in their existence sends an adaptive message: I will be moral.
More generally, we adopt beliefs and narratives to signal our class identities, tribal allegiances, and personalities, including how nice or dominant we are.
Propaganda, performance, and paranoia
Many forms of paranoia have their roots in these social goals. They are neither mistakes nor symptoms of cognitive dysfunction but the outcome of strategic processes designed to spread self-serving propaganda or signal the paranoiac’s traits.
Propagandistic paranoia
One way that paranoid beliefs serve propagandistic functions is through demonizing narratives.
As the name suggests, demonizing narratives demonize target individuals and groups. As a form of propaganda, they serve several valuable functions.
First, human societies enforce norms against unjustified aggression and hostility. So, if you want to exploit, victimise, manipulate, or eliminate people, you need a justification. Demonizing narratives satisfy this need: by characterising targets as possessing sinister qualities or engaging in sinister acts, you provide an excuse for hostility towards them.
Second, demonizing targets also serves to increase other people’s fear of and antagonism towards them. Notably, many of the features that make social vigilance—the detection of social threats—difficult also make paranoid propaganda effective. For example, as I noted in the previous essay, once paranoid suspicions arise, they are often uniquely difficult to falsify. What would disconfirm the hypothesis that people are secretly conspiring? If the worry is that they are deliberately trying to avoid detection, you should expect it to be hard to find evidence of their activities.
Paranoid propaganda can exploit this phenomenon. Demonizing narratives are often designed to be difficult to disprove, which makes it hard for their targets to rebut them decisively and easy for propagandists to shield their paranoid fantasies from clear-cut refutation.
Witches, conspiracies, and panics
One example of demonizing narratives is the cross-cultural phenomenon of witchcraft accusations.
When community members are motivated to attack someone—for instance, because the person is a burden on others or because others want to reduce their status—depicting them as powerful, vile, and threatening can serve useful propagandistic purposes.
As Pascal Boyer writes,
“It helps to see witchcraft accusations as a form of stigmatization, providing a coordination point for coalitional alignment against a particular individual… People who have some interest in inflicting harm on a particular individual may use witchcraft accusations rather than a direct attack because the accusation makes it possible to recruit allies against the target, whilst maintaining one’s own reputation.”
One can also see demonizing narratives in many conspiracy theories. From the Nazis’ paranoid delusions about the Jews to modern-day QAnon believers’ fantasies about devil-worshipping elites, conspiracy theories wildly exaggerate the threat posed by target groups. In so doing, they excuse attitudes of extreme hostility towards those groups.
For this reason, demonizing narratives also typically precede pogroms, ethnic riots, and genocides. By depicting victims as evil and highly threatening—for example, by spreading paranoid rumours about horrific things they have done or baseless theories about their hidden but insidious influence on society—such narratives usefully justify the targets’ elimination.
Moral panics
You can also find paranoid propaganda in many moral panics. In some cases, it drives such panics from the start; in others, it sustains them once they arise.
For example, throughout the 1980s and early 1990s, the “Satanic panic” spread across much of the Western world. It was driven primarily by the widespread belief that powerful Satanic cults were snatching children from daycare centres and torturing them in elaborate rituals featuring depravities like child sacrifice, blood drinking, and foetus eating. Although there was no evidence for such paranoid fantasies, many thousands of people were accused of these crimes.
At first, the panic seems to have been driven by sincere but mistaken fears—that is, by maladaptive fears. During the 1980s, for example, the USA experienced several events and trends that lent the initial accusations plausibility in many people’s eyes: a resurgence of evangelical Christianity, a society-wide reckoning with the reality of child abuse, seemingly credible “experts” who testified to the reliability of phenomena such as repressed memories, and sincere testimony from adults reporting childhood experiences of Satanic ritual abuse.
Nevertheless, as the panic raged on and it became increasingly clear that the accusations were baseless, many of the “Satan hunters” who had pushed and amplified the panic acquired strategic motivations for keeping it going.
As Will Storr argues in an extraordinary analysis of the episode, the panic had been financially and reputationally lucrative for the many journalists, therapists, social workers, and other professionals who had devoted themselves to fighting the imaginary Satanic cults.
Storr observes that, as with many such “moral panics”, much of its “explosive energy” derived not “from panic but from desire for acclaim” among people who enjoyed tremendous
“status in the form of influence, acclaim, cash, fame, proximity to the prestigious games of law, media and government and the reputation of an avenging angel, defending the lives of America’s children.”
In other words, the society-wide paranoia involved in moral panics can benefit those who acquire reputations for fighting the imagined threat.
Expressive paranoia
Just as paranoia can serve propagandistic functions, it can also send useful signals.
This signalling function plays a role in demonizing narratives. By painting individuals and groups as sinister threats, such narratives do not merely justify hostility towards them; communicating them also sends an unambiguous signal of the communicator’s intentions.
For example, if you accuse somebody of witchcraft, you leave others in no doubt about your attitudes and feelings towards the accused. For this reason, the accusation facilitates social coordination: others learn that they can expect your support if they attack the target of the allegation.
Something like this probably plays a role in modern-day conspiracy theories. If people want to form a community organised around contempt for elites and the establishment, a person’s willingness to endorse hyperbolic demonizing narratives—for example, painting such elites as Satanic child abusers—sends an unambiguous signal that they share similar attitudes.
Signalling vigilance
In addition, paranoia can serve the more general function of signalling vigilance against exploitation.
In The Godfather, Don Corleone delivers the following message to an audience of mafia bosses:
“I'm a superstitious man, and if some unlucky accident should befall [my son]... if he should be shot in the head by a police officer, or if he should hang himself in his jail cell, or if he's struck by a bolt of lightning, then I'm going to blame some of the people in this room.”
In one sense, such tendencies towards superstition and jumping to conclusions are irrational. However, one of the central lessons of game theory is that behaviours that would be irrational if one were alone on a desert island can advance one’s strategic interests when interacting with others.
This lesson applies to paranoia. An irrational tendency to jump to paranoid conclusions on minimal evidence can act as a deterrent if others know about it. By signalling hyper-vigilance to manipulation and hostility, you teach others that they are unlikely to evade detection if they are tempted to wrong you in some way.
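To make the game-theoretic logic concrete, here is a minimal illustrative sketch in Python. The payoff numbers and the expected_payoff_of_cheating function are invented placeholders, not anything from the essay or a standard model; the point is simply that a reputation for leaping to conclusions on thin evidence raises the perceived probability of detection, which can flip the expected value of wronging you from positive to negative.

```python
# Toy illustration (invented numbers): why a reputation for "paranoid"
# vigilance can deter exploitation even if the vigilance itself looks irrational.

def expected_payoff_of_cheating(p_detect, gain_if_undetected, cost_if_caught):
    """Expected value, to a would-be cheater, of wronging someone."""
    return (1 - p_detect) * gain_if_undetected - p_detect * cost_if_caught

# Against a relaxed target who rarely investigates, cheating "pays" in expectation.
print(expected_payoff_of_cheating(p_detect=0.1, gain_if_undetected=10, cost_if_caught=50))  # 4.0

# Against a hyper-vigilant target known to pounce on minimal evidence,
# the same act has negative expected value, so a rational cheater is deterred.
print(expected_payoff_of_cheating(p_detect=0.6, gain_if_undetected=10, cost_if_caught=50))  # -26.0
```

On this toy model, the deterrent only works if the vigilance is visible to others, which is why the signalling has to be public rather than private.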
Once again, this tendency seems to be widespread in human social life.
For example, consider the jealous husband who is constantly jumping to baseless suspicions about his wife’s infidelity, making himself and his wife miserable. Although the behaviour seems irrational and maladaptive, it can also signal vigilance in ways that reduce the likelihood of infidelity. The message is clear: You are being monitored. You will not be able to cheat on me without me knowing.
Something similar might also play a role in bizarre conspiracy theories. Somebody who acquires a reputation as endorsing paranoid theories about the behaviour of distant elites also sends a clear signal to those in their immediate surroundings: I am the kind of person willing to jump to conclusions about malevolent intentions based on minimal evidence. Don’t fuck with me.
Conclusion
Paranoia can be a strategic tool, not just a psychological error. As demonizing narratives, paranoid fantasies justify hostility and coordinate collective action. As signals, they send unambiguous messages about allegiances and vigilance.
This dual nature of paranoia—as bug and feature of human psychology—helps explain its ubiquity and persistence throughout history. It is both an error that is difficult to avoid and a strategy that is useful to deploy, a symptom of cognitive failure and a rational response to social incentives. In this respect, paranoia reflects the complexity, fragility, dangers, and opportunities of human sociality.