There is no "woke mind virus"
The ideas you dislike—whether wokeism, religion, or misinformation—are not “mind viruses” and do not spread via contagion. This framing seeks to demonise, not understand, and poisons public discourse.
Parasites of the mind
“In the same way that brain parasites have evolved to take advantage of their hosts in the furtherance of their evolutionary objectives, parasitic viruses of the human mind (devastatingly bad ideas) function in a similar manner. They parasitize human minds, rendering them impervious to critical thinking, while finding clever ways to spread across a given population (for example, getting students to enroll in women’s studies departments).”
So writes Gad Saad in The Parasitic Mind, a book-length complaint about “mind viruses” like “postmodernism, radical feminism, and social constructivism”, all of which are “bound by the full rejection of reality and common sense.”
Saad is not alone in his analysis of “woke” ideas. In the past several years, talk of the “woke mind virus” has itself gone viral among right-wing culture warriors. In a recent interview, Elon Musk alleged that the mind virus had even killed his child. What he meant was not that it had actually killed his child—they are very much alive—but that his child is transgender and so, apparently, dead to him.
According to Musk, he was “essentially tricked into signing documents” allowing his child to take puberty blockers, which “are actually just sterilization drugs.” This experience radicalised him: “I vowed to destroy the woke mind virus after that.”
Mind viruses everywhere
Characterising ideas as “mind viruses” is not new. Since 2016, it has been the favourite metaphor adopted by the opposite side in the culture war to characterise online misinformation and conspiracy theories.
Bad ideas, writes philosopher Andy Norman,
“have all the properties of parasites. Minds host them, the way bodies host bacteria. When bad ideas spread, they replicate—copies are created in other minds. An idea can even induce its host to infect other minds, just as the flu virus can induce an infection-spreading sneeze. Finally, bad ideas are harmful almost by definition.”
Similarly, misinformation researcher Sander van der Linden writes,
“There are viruses of the mind and not just the biological kind… Misinformation, conspiracy theories, and other dangerous ideas, latch on to the brain and insert themselves deep into our consciousness… Misinformation can fundamentally alter the way we act and think about the world. The misinformation virus hijacks parts of our basic cognitive machinery.”
At least since Richard Dawkins first introduced the concept of “memes” in the 1970s, this framing has also been influential in how many people analyse religions.
“Mind viruses” do not exist
Ideas, including bad ones, are not infectious mind viruses. This metaphor rests on an inaccurate picture of human psychology and social behaviour that functions to demonise, not understand. Because of this, it poisons public debate, increases polarisation, and hinders our collective capacities to understand the world and each other.
I will make three general points:
The "mind virus" metaphor assumes the truth is self-evident, so false beliefs must stem from irrationality. This neglects how people form beliefs based on different information, trusted sources, and interpretive frameworks, which means rational individuals can easily develop radically divergent worldviews.
People often embrace and spread ideas because they serve practical goals beyond truth-seeking. For example, religious, ideological, and conspiratorial narratives often serve propagandistic functions or promote people’s social interests. Such motivated reasoning looks nothing like the passive infection by “mind viruses”.
Belief systems do not spread via simple contagion. They are maintained through complex social dynamics and incentives in which members of belief-based tribes win status by enforcing, rationalising, and spreading bespoke realities.
1. The truth is not self-evident
Those who characterise ideas they dislike as “mind viruses” do not think their own ideas are mind viruses. For example, according to figures like Gad Saad and Elon Musk, wokeism is an infectious mind virus, but anti-wokeism is not. Likewise, Richard Dawkins argues that religion is a “virus of the mind”, but scientific ideas are not, and misinformation researchers do not explain the popularity of their research in terms of social contagion.
Given this, to establish that something is a mind virus, it is not enough to point out that it “spreads” from one person to another. That is true of virtually all ideas.
Humans are social animals. We influence each other in countless ways via communication and social referencing, and most of what we believe in domains like politics is based on information—evidence and arguments—we acquire from others. This is just as true of anti-wokeism as wokeism, for example.
What, then, distinguishes the good ideas from the mind viruses?
One leading argument is that mind viruses bypass people’s rational faculties or critical thinking. In other words, good ideas spread because they have positive epistemic qualities (evidence, plausibility, coherence), while bad ideas spread despite lacking these qualities.
Naive realism
This argument is a classic example of naive realism.
Naive realists think “the truth” is self-evident, which means anyone who does not see the truth—who disagrees with them—must be evil, irrational, or brainwashed. As Walter Lippmann observed,
“He who denies my version of the facts is to me perverse, alien, dangerous. How shall I account for him? The opponent has always to be explained, and the last explanation that we ever look for is that he sees a different set of facts.”
This attitude underlies the “mind virus” discourse. Its motivating idea is: “My views are so self-evidently correct that the only explanation of why anyone would disagree with them is brute irrationality.”
Naive realism is a mistake.
The relationship between “the facts” (what Lippmann called the “real environment”) and the ideologies we form to explain those facts (the “pseudo-environment”) is filtered in countless ways by what information we attend to, which voices we are exposed to, whom we trust, and how we interpret the world. Due to this heavily mediated process, our ideologies are always highly selective, low-resolution pictures of a vast, complex reality. Given this, rational people can easily end up with wildly divergent worldviews simply because they were exposed to different information, trusted different sources, focused on different facts, or interpreted such facts through different explanatory frameworks.
Wokeism
To make this more concrete, consider wokeism. Although there is endless disagreement over what “woke” means, it is easy to point at the kinds of ideas typically associated with it: Western societies are structured to benefit some groups (e.g., heterosexual white men) at the expense of others (e.g., racial minorities, women, queer people, etc.); historical oppression continues to impact marginalised groups; “systemic” and “structural” discrimination and “implicit biases” perpetuate unjust inequalities; intersecting forms of oppression involve complex interaction effects; “neutrality” (i.e., failing to endorse and support woke activism) perpetuates social injustice; and so on.
It takes a severe failure of imagination to think that brute irrationality or mindless conformism is the only explanation for why many people find this package of ideas appealing.
For example, there appears to be considerable evidence for wokeism in the form of shocking disparities in outcomes (income, education, health, political influence, etc.) between different demographic groups. Experts have also conducted countless historical, sociological, psychological, and philosophical studies explicitly justifying and supporting woke ideas. Moreover, many impressive, high-status, seemingly trustworthy people repeatedly endorse or assume a woke worldview.
None of this means that wokeism is “true”. Like other popular ideologies, I think it contains a mixture of truths, insights, omissions, exaggerations, obfuscations, and misleading explanations.
However, only a naive realist would draw a tight connection between rationality and truth. Even if you think wokeism is deeply flawed, your reasons for this assessment—your contrary evidence, alternative explanations, and criticisms—are not built into the visible furniture of reality, accessible to direct observation if only your ideological enemies would look. They are constituents of your pseudo-environment, constructed from your distinctive history of learning, reasoning, and trusting.
2. Motivated irrationality
Of course, I do not mean to imply that everyone is always perfectly rational. For example, when Elon Musk recently declared that “civil war is inevitable” in the UK during riots involving a fringe group of fanatics and thugs, that was extraordinarily stupid.
Nevertheless, Musk’s periodic outbursts of stupidity are not illuminated by positing a right-wing “mind virus”. Instead, they are symptomatic of the more mundane fact that he is a petty, emotionally incontinent edgelord who seeks validation from right-wing culture warriors online.
This touches on another problem for the “mind virus” analysis: even when systematic failures of rationality are relevant to the adoption and spread of beliefs, this is not because people have become passively infected by mind viruses. It is typically because people have other goals and interests distinct from the disinterested pursuit of truth.
Religion
Consider religion. From a purely epistemic point of view—from the perspective of exclusively caring about the truth—many (although not all) religious beliefs are difficult to understand. Richard Dawkins is correct about that. However, it does not follow that religious beliefs are memetic mistakes spread without concern for people’s interests and goals.
Instead, research increasingly suggests that many religious belief systems are social technologies gradually crafted and refined over time by strategic agents driven to (i) encourage others to be more moral and (ii) signal their own moral commitments. This is why most religious belief systems posit supernatural agents (deities) and forces (karma) that monitor and incentivise moral behaviour. It is also why religious communities often moralise commitment to shared religious beliefs, treating them as sacred orthodoxies, creating duties to affirm, protect, and spread them, and punishing blasphemers and heretics who threaten them.
Given that the function of such ideas is not truth, it is unsurprising that religious people often treat their beliefs in ways that seem irrational from a narrowly epistemic point of view. However, underlying such epistemic irrationality is an important social rationality, which might explain why religious people are often happier, healthier, more prosperous, and more fulfilled than atheists.
Ideologies
Similar lessons apply to political ideologies. Although some people adopt ideologies solely because they are persuaded by evidence and arguments, practical goals and interests are also often important.
For example, ideologies often serve propagandistic functions: they feature ideas and narratives that specific segments of society benefit from spreading. When that happens, the beneficiaries of such propaganda can become highly attached to the ideologies for reasons independent of their truth.
Similarly, endorsement of certain ideologies can perform social signalling functions. In many contexts, embracing and affirming certain ideologies signals that the believer has attractive qualities or displays allegiance to specific individuals and groups. Once again, when people benefit from sending such signals, they often prioritise commitment to the ideology over pure rationality.
Of course, in many cases, the connection between self-interest and ideology is transparent. It does not take much insight to see that kings benefited from widespread belief in the divine right of kings, for example, or why dominant white populations throughout recent centuries were strongly attracted to white supremacist ideologies.
However, self-interest plays a role in the endorsement of most ideologies. For example, wokeism aims to allocate more status and power to members of traditionally marginalised groups, especially in high-status professions. Given this, members of such groups in these professions clearly benefit from spreading it.
Similarly, educational polarisation and other forces have created a situation in many Western countries in which university-educated white professionals form part of a left-wing political coalition with racial minorities. The general motivation to promote ideologies that benefit one’s allies, coupled with the fact that endorsing highly progressive views distinguishes these professionals from non-college-educated, working-class whites, might therefore explain this group’s attraction to woke ideas.
Of course, these cynical explanations are never the entire story. Sometimes, they are not even the main story. As noted, many people are likely attracted to woke ideas simply because they have been persuaded by evidence and rational arguments. However, the conviction that one is a crusader for social justice can drive its own forms of motivated reasoning. If the endorsement of an ideology signals that one is a good person, people’s intense desire to be seen as a good person can drive them to embrace the ideology with unusual fanaticism.
Whatever the cause, the role of motivated reasoning in religions, ideologies, and even popular conspiracy theories reveals how misleading the “mind virus” metaphor is. People do not want to spread or become infected by harmful parasites, yet they often actively seek out and embrace irrational ideas. Moreover, these ideas do not have interests of their own. They are crafted and refined in ways that advance people’s practical goals and agendas.
3. Belief systems do not spread via contagion
Finally, the preceding analysis focuses on individual psychology. However, you cannot understand religions, ideologies, or popular conspiracy theories without engaging with the social dynamics that underlie their emergence, maintenance, and transmission.
Once you focus on this social dimension, it becomes even more apparent how misleading the “mind virus” metaphor is.
Popular belief systems that people treat as mind viruses do not spread via mere contact or “exposure” among hapless victims. Instead, they are shared among committed communities of co-believers: belief-based tribes that coordinate in complex ways to produce, protect, and propagate bespoke realities.
First, creating and maintaining identity-defining narratives often requires hard work. Although individual-level motivated irrationality plays a role, this work is usually outsourced to complex forms of social scaffolding. Belief-based tribes transform shared beliefs into sacred orthodoxies, enforce norms against challenging them, and encourage group members to communicate supporting evidence and arguments. Consequently, community members are rarely exposed to counter-evidence or criticism from sources they trust.
Second, smart people can win status within communities of co-believers by investing time and ingenuity into defending and justifying preferred conclusions. For example, even among QAnon believers, so-called “Bakers” compete for community clout through forms of intellectual labour “designed to construct specific facts and theories that maintain QAnon’s cohesion over time.” Such epistemic status games are better analysed in economic terms than epidemiological ones. Instead of gullible victims infected by contagious ideas, we find a marketplace of rationalisations that directs intellectual energies to where they win the greatest returns.
Finally, belief-based tribes often reward what Will Storr calls “active belief”, passionate displays of commitment to shared realities socially rewarded—and hence incentivised—by fellow group members. Active believers are true believers. They evangelise, proselytise, and propagandise. They form energetic mobs that punish or “cancel” dissenters, heretics, and apostates. And they make costly sacrifices to signal sincere devotion to “the cause”.
As Eric Hoffer observed in the 1950s, the striking thing about such forms of fanaticism is their emergence within communities and movements with very different—sometimes diametrically opposed—beliefs. For example, cancel culture dynamics characterise ideological communities across the political spectrum and diverse religious groups throughout history. This pattern becomes less surprising once you understand how the identity-defining narratives of belief-based coalitions create systems of social rewards and punishments that encourage such behaviours. In contrast, appealing to the idea of contagious mind viruses explains nothing.
Why this matters
At best, the “mind virus” metaphor provides a pointless redescription of something we already knew: that humans influence each other and sometimes adopt false beliefs.
At worst, it distorts our understanding of beliefs, psychology, and society. It replaces a complex reality of perspectives, rationality, agency, self-interest, and social coordination with a self-serving, stick-figure cartoon.
Why does this matter?
First, the metaphor poisons public discourse and exacerbates polarisation. The depiction of people as hapless victims of brain parasites functions as a demonising narrative that makes productive disagreement impossible.
Second, it reflects and encourages intellectual complacency and arrogance. If rationality played no role in the emergence of other people’s beliefs, there would be no reason to engage with them rationally. Once you understand that this is wrong—that people have epistemic and practical reasons for their worldviews—you should take their views and interests more seriously. Moreover, if people just like you—not gullible victims of mind viruses, but ordinary, rational people—could embrace opinions you think are wrong, you should become more open to the possibility that your opinions are wrong.
Finally, suppose you think religion or wokeism or anti-wokeism or conspiracy theories or whatever else are profoundly mistaken and harmful. If you treat those who endorse them as victims of mind viruses, you will not understand how to address or combat those ideas.
For example, most people significantly undervalue the possibilities of rational persuasion. Contrary to conventional wisdom, people can typically be persuaded by rational arguments. The conventional wisdom that they cannot likely reflects the same mistaken intuition underlying the mind virus metaphor: that rationality plays no role in giving rise to beliefs one strongly disagrees with.
Moreover, understanding that a hidden practical rationality often underlies popular belief systems opens up further possibilities for intervention. For example, you might try to change those features of society that make destructive belief systems highly attractive to people. You might also try harder to safeguard norms and institutions (free speech, viewpoint diversity, academic freedom, etc.) that protect societies from belief-based tribes focused on spreading and enforcing their preferred narratives.
There are no easy solutions here. However, you will not be able to develop any solutions if you misunderstand the nature and causes of the problem.