23 Comments
Synthetic Civilization

The harder truth becomes to verify, the more belief becomes an infrastructure problem.

In high epistemic complexity, people don’t self-deceive in isolation, they outsource judgment.

That’s where legitimacy quietly gives way to legibility and default authority.

Complexity isn’t just a cognitive challenge; it’s a power multiplier.

Wild Pacific

Lots of substance, good topic.

Had to reread some paragraphs, as the material is naturally hard to parse, with no blame on the author. The topic is the meta of it all, after all. Dan is great.

Objections. The topic is not resting on its right side, I believe, which is: the usefulness of partial and derivative logic in survival. I haven’t read nearly enough of the philosophy authors mentioned, but I am influenced by the school of thought behind Donald Hoffman’s “The Case Against Reality”.

Truth, whatever we mean by it, is definitely a direction, not a point. So seeking it is definitely an endless quest, and on this path through the fractal we (humans, life, maybe the universe itself) must optimize.

Funny thing is, low resolution matters a lot! When our bandwidth is low, we totally decrease resolution of our photos, as Mail app helpfully suggests.

For life forms, choosing which stimuli are more important at the moment matters. And when stimuli are overwhelming, we decline to choose, allowing ourselves to “coast” instead. Think of how a bird of prey cannot attack a flock, lacking a single target, and retreats.

Which brings me to the main point: exposure to an event or source of information does not equal absorption of the information. A person, however smart, will be exposed to something that, due to their lopsided expertise and life path, is too “noisy”, and will decline to choose, or will choose the lower-resolution solution, which may have a high chance of being wrong.

This is why academics and nerds suck at fashion, despite having so many choices nowadays — their mind is just preoccupied with other stimuli, most of the time. Only half-joking! ☺️

tl;dr: Epistemic complexity (too much info to process due to other priorities or lower abilities of the host) and Motivated Cognition (this decision feels good) are the same thing. They both spring up from the pressure to maintain state and “survive” and “procreate” as an informational (formerly: biological) organism.

Paul

I'm struggling with what you mean by epistemic complexity. Let's say there is reality, perceived reality and mediating narratives. The size of networks adds to the access to indirect perceived reality (you know more people who are also perceiving the world). The multiplicity of narratives adds a different type of complexity. A small set of mediating narratives gives you one type of elite control. Alternatively, flooding the space with many mediating narratives likely leads to tribal attractors, but there is no control over what those attractors are. Flooding the space of narratives creates a sense of arbitrariness about "correct" narratives, since you cannot learn and test them all, and puts a premium on finding a network with a common narrative as a form of narrative prioritization (relevance realization).

So is the complexity that we are perceiving more broadly (live videos of Ukraine and Iran, access to libraries of data), or that there are more organizing narratives we come in contact with (panpsychists, metamodern spiritualists, corporatists, the naive karmic view, the MAGA party line...)?

Simon Roberts

Loved this post Dan. Alongside motivated cognition, people are uncomfortable with probabilistic thinking and scenarios. A lot of life is in the margins, and there isn’t a definite “correct” answer most of the time.

Kevin O'Neill

I really like this perspective and will reread to internalize it more.

I'm curious though what you think of something like Dan Kahan's study about motivated reasoning: that people will reinterpret math facts to align with tribal norms, and this effect is stronger when people are better at math. That does seem to show that motivated reasoning supersedes "rationality" when identity is at stake even when the relevant facts are right in front of you.

Dan Williams

Thanks Kevin. Those studies haven't replicated very well, although even if one takes them at face value, I don't think they're inconsistent with the constrained character of motivated cognition identified here. They would simply show that we're worse at reasoning when identity and motivations are in play, which is true, not that we can believe whatever we want.

Jonah 約拿

Interesting. I would think that attentional limits play a role here, too. Even if someone were in a healthy epistemic environment and sincere about truth-seeking, the sheer amount of information they must process to form and maintain accurate beliefs could prove unsustainable, in which case zeroing in on a master explanation of some phenomenon, ideally with a master slogan and a delimited set of master concepts, looks all the more attractive.

Cf. Haidt on monomania, but also (and more dear to my heart) Xunzi, who joined other figures in ancient Chinese epistemology in lamenting cognitive obscuration or fixation by limited values and concerns: "Whenever people are knotted up with problems, it is that they are obscured by one twist and become ignorant of the greater principle. With order, one recovers the pattern; with contrary doubts, one falls into confusion. [...] Wherever the myriad things differ, they cannot but serve to obscure one another. This is the common problem for the arts of the mind."

(凡人之患,蔽於一曲,而闇於大理。治則復經,兩疑則惑矣。[...] 凡萬物異則莫不相為蔽,此心術之公患也。" Xunzi HKCS 21/102/5-13)

G Raghuram

The adjective "rational" is doing a lot of heavy-lifting here. What happens if you interrogate it fully?

Also, (trite example that came to mind): Someone insists "The sun rises in the West". I get him to wake up early and point me to where he thinks the sun will rise from. This is easily verified - physically, in a matter of minutes, if we wait. Till then, only two questions arise - is this guy oriented correctly (to the cardinal directions; does he have a left-right confusion), or is he just choosing to use different sounds for the usual ones we do? In any case, once the sun rises (or starts showing up) and it matches the direction he had indicated earlier, I should not and don't care any more what he calls that direction or what he contends. He's okay.

Lastly, I think the question of complexity needs more working out. There is no reason to assume anyone is being irrational for holding some belief about the world that has little to do with his daily life.

Susan Scheid

Well, I do believe you’ve hit it out of the park once again with this essay. I have to go back through and also check the links, but based on my first read, I have a couple of observations/questions:

First: A la George Orwell’s Newspeak, do I see implicit (or maybe explicit?) in your essay the use of obfuscation as a propagandist strategy to create the appearance of complexity and thereby make it virtually impossible to decipher the truth?

Second: Your essay brings to mind George Lakoff’s “worldview” hypothesis. Until very recently (you see how backward I am . . .) I was not aware of his work, but the “worldview” idea resonated. I am finding his level of abstraction a struggle, but here’s where I come out so far:

The family-role framings he used seemed out of date to me, so I challenged myself to re-describe them, using my mother (97-year-old, hardscrabble working class, lifelong conservative) and me (76-year-old overeducated liberal/left) as “templates”:

Conservative=strong emphasis on self-reliance. This can extend beyond the immediate and extended family to the local community, including, eg, neighborhood, religious, and secular community groups.

Liberal=strong emphasis on interdependence, not only at an interpersonal, but also at a societal level.

With this exercise, I found it quite easy to see the coherence of her worldview, which I do not find at all irrational, even though I disagree.

OK, this is all probably once again way off target of the import of your essay. As always, thank you for pushing us all to think harder and better.

hn.cbp

This analysis is very strong on why false beliefs are easier to sustain under epistemic complexity.

But I wonder if the deeper problem today isn’t primarily about belief formation at all. In many contemporary systems, people can hold inaccurate, shallow, or even contradictory beliefs — and yet outcomes proceed largely unchanged.

The critical shift seems to be that belief is no longer the main bottleneck where decisions are constrained. Epistemic complexity doesn’t just make self-deception easier; it allows action to close elsewhere — through delegated systems, institutional pipelines, and pre-structured flows — before reflection or correction can meaningfully intervene.

In that setting, the danger isn’t simply that people believe false things. It’s that belief, true or false, arrives too late to bear responsibility. Motivated cognition still exists, but its social impact now depends less on what people believe than on whether belief still has standing at the point where decisions are made.

This may explain why increasing epistemic clarity often fails to restore accountability: the problem is no longer epistemic error alone, but the relocation of agency away from sites where understanding could still interrupt outcomes.

A. Jacobs

This feels like the psychological layer of epistemic drift. As reality gets harder to model, people don’t just make more mistakes, they get better at defending the ones they want to keep. Complexity creates the conditions where self-deception can scale.

The Scam Doctor

The harder it is to find the truth, the easier it is to scam people about their health:

https://thescamdoctor.substack.com/p/doing-your-own-research-could-kill?r=6hgshq

El Monstro

This is great stuff as usual but seems obvious to me. Maybe it’s because I have been reading you so long, but more likely it’s due to my naturally skeptical nature. When someone says something, my first instinct is always to ask myself “why do they think that this is true?” I have learned not to say it out loud, though, because most people take it the wrong way.

But more importantly I always ask myself “why do I think that this is true” and am always curious how I end up justifying my “naive realism” to myself. This results in me examining my own biases and looking for sophisticated thinkers who disagree with me. Which is what led me to this substack in the first place!

Okay. You made your point. So what? There is an implicit plea here for more self doubt and awareness. But what else? I hope you follow up on this.

Jan Zilinsky

A reliably excellent and thought-provoking essay; I will try to simplify a bit (for myself, and maybe for others).

- A low-complexity scenario: You are about to leave the house. The sky is dark grey, making it hard to tell yourself "it probably won't rain". Even if you don't like carrying umbrellas, evidence forces your hand: you take the umbrella. And you tone down the internal "I might lose it" and "it will ruin my outfit" talk that would have allowed you to downplay a more ambiguously colored sky.

- A higher-complexity scenario: Packing for a trip next week, you prefer to be casual and only bring a tiny backpack. You also hate baggage fees. The future is uncertain, you're free to cherry-pick memories, you pack super-light.

A friend then intervenes with information about your destination. You pack more clothes. A possible conclusion: when presented with facts, people update their beliefs rationally. Therefore, motivated cognition isn't that strong. People are reasonable truth-seekers.

The Williams critique: by forcing the information on a person, you weakened the very mechanism that allowed motivated cognition to do its work.

Michael Mohr

Yep. Very true. Denial. Suppression. Reality often clashes with convenient, simplistic narratives. Happens on both political sides.

Richard Williams

Brilliant! Now please forward to every wannabe functionally sentient AGI you know. Thanks.

H Grumpy

“. . . highly selective, low-resolution compressions of reality.” Epistemological complexity might have another implication here, when truth requires not swapping out a false ‘fact’ for a true one, but coming up with a higher resolution model of reality that can reconcile both. Social reality in large, diverse societies may require models too complex for the average voter (or any individual) to hold.

Also, I’d add anger to your list of motivated-cognition responses to incompatible facts. It’s probably muted in academic contexts by norms of argument, but for most people there is some set of beliefs that is defended at a pre-conscious, emotional level before cognition has time to kick in.