Wishful Thinking Is A Myth
How social games, not comforting falsehoods, distort what we believe.
Many people believe that we human beings have a powerful tendency to convince ourselves of comforting falsehoods. We engage in wishful thinking, confusing our desires with our beliefs. We believe what we want to be true, not what is true.
More generally, we let our emotions distort our mental models of reality, embracing beliefs and belief systems that substitute reassuring myths for harsh realities.
Many also believe that this psychological bias is a significant force in human affairs. For example, it is supposed to explain why people fall prey to “positive illusions” (e.g., self-serving and self-aggrandising beliefs), why they convince themselves of religious fairy tales (the “opium of the masses”), and even why they accept absurd conspiracy theories, which allegedly reduce negative feelings associated with uncertainty and a lack of control.
This hypothesis—call it the “you can’t handle the truth!” model of human psychology—is so widespread that most people don’t even treat it as a hypothesis. It is viewed as a basic datum of the human condition, a powerful bias that might explain other things—self-deception, politics, religion, conspiracy theorising, and so on—but that couldn’t itself be seriously questioned.
For example, Scott Alexander simply defines motivated reasoning as “the tendency for people to believe comfortable lies, like ‘my wife isn’t cheating on me’ or ‘I’m totally right about politics, the only reason my program failed was that wreckers from the other party sabotaged it.’” In a post outlining his preferred explanation of this tendency, he notes that the “question – why does the brain so often confuse what is true vs what I want to be true? – has been bothering me for years.”
In contrast, I think Alexander has been bothered by a myth. There is no powerful tendency in human psychology to confuse what is true with what we want to be true. People do not generally convince themselves of comforting falsehoods.
Admittedly, there are some things in the vicinity of this tendency that are real. For example, we sometimes avoid acquiring or dwelling on information when we anticipate that doing so would be unpleasant, although this isn’t a very significant force in human affairs.
Moreover, I am not denying that motivated reasoning—the tendency for practical motivations and interests to distort our view of the world—is a powerful bias in human psychology. My claim is rather that the “you can’t handle the truth!” model completely misrepresents how motivated reasoning works in most cases.
Put simply: Although people often believe what they want to believe, they rarely believe what they want to be true.
Put another way: We often convince ourselves of falsehoods, but rarely reassuring or comforting falsehoods.
This is because motivated reasoning is driven by strategic, social goals rather than emotional ones. To understand how it works, you must replace the “you can’t handle the truth!” model with the “believing true things is often maladaptive in social games involving persuasion, reputation management, and status competition” model.
In this post, I will:
Describe the problems with the “you can’t handle the truth!” model.
Outline a rival social model.
Explain the former’s popularity.
As I will review, the social model is not original to me. It builds on the work of numerous scholars stretching back several decades. My goal is to draw these ideas together into a unifying framework and to highlight its theoretical and empirical support and explanatory power.
I will end by arguing that the “you can’t handle the truth!” model of human psychology is not just mistaken; it is pernicious. It encourages the view that when people accept “harsh” beliefs that they don’t want to be true, they are being rational and truth-seeking—even heroic. In reality, people are often motivated to convince themselves of negative, pessimistic beliefs, and it often takes courage and intellectual virtue to confront positive truths.