75 Comments

I agree with everything you said, and yet I still feel that there must be something we can do to stem the harmful kind of misinformation that is not censorship.

For instance, on Substack everyone can say whatever they want and the most outrageous thing someone can say may be read by the people who follow them, but it won’t reach the rest of us who aren’t in that sphere. If they do pop into our spheres somehow, they are easily blocked or reported by readers. Most likely that community will stay small and/or blocked by people who don’t want to see it, and thus it won’t gain traction the way outrageous things do on other platforms. We can self-regulate our own little communities without needing someone above us to make that choice for us. This seems a better way to design.

As for presidential candidates, I’m not sure the right answer. Again, it’s not censorship, but perhaps there is something else. Like independent writers who have gained trust for taking a bi-partisan view. Who can research the answers and share their own independent fact checking with an audience that doesn’t trust Fox News or CNN to do it. Less partisan media sources might be a good solution.

I agree that misinformation has always been an issue, but it has also adapted for our times, and that means adapting how we handle it. Censorship is not the answer, but I think there are other things that could be and it’s worth coming up with those solutions.

I agree with all this - thanks for an insightful comment!

There are also other nuances... for example, I firmly believe that most of the most outrageous comment-section posts that are not made by bots are false-flag posts by opposing teams in order to validate stereotypes... e.g. the Jussie Smollett gambit.

Tangle is an example of considered independent journalism. Interesting. But it has no reach and no impact.

I think Tangle is interesting, but I just don't think it's that good. I also think we need to think outside the lines of "here's what democrats think" and "here's what republicans think." Because I think a lot of us don't follow those hard lines. You hear people say things like "I'm socially democrat, fiscally republican." Well what does that mean? I think we should explore those lines more!

It means they like sex, drugs, rock-and-roll, but not unions, public services, regulation of business. They don't want Jesus in government, but they're fine with Mammon. As Freddie deBoer put it: "21st century liberalism is ensuring a panel at a defense industry conference called Building a Deadlier Drone has adequate gender diversity."

We are in some sort of epistemic crisis. The "experts" are claiming that misinformation is our greatest threat while at the same time public trust in many institutions is declining. For me, the issue seems to be that there is little (if any) accountability for when these institutions mislead the public, much like there is little accountability for scientific fraud. Unfortunately, this incompetence opens the door for misinformation. People want to go back to a world where these institutions are trusted again, but they don't deserve our trust and they don't seem willing to change or even acknowledge that they are part of the problem.

We are getting to the point where there simply are no universally trusted sources of truth in our society.

I agree these issues of trust are complicated. For all their faults, I think mainstream institutions (mainstream media, science, public health) are still by far the best we've got and don't warrant absolute distrust - however, I also agree that it's really important to build trust in them.

I'm hopeful that we will soon be able to create AI agents to help with this - searching out fraud and misinformation in mainstream institutions. Make it more difficult to get away with deception without people knowing about it.

Agree.

It is a matter of experience and discernment *which* institutions have earned trust, rather than none having earned it.

Also, some institutions have earned trust in some matters, but distrust in others. Do I trust my U.S. government to effect natural disaster relief? Largely, yes. Do I trust them to responsibly shepherd our $35 trillion sovereign debt? Or our tens of trillions of unfunded future liabilities for entitlement programs? Hell no.

(But that doesn't stop the vast majority - tens of millions of people - from *habitually* voting for the same irresponsible parties and politicians. Draw your own conclusion about the intelligence of most people here.)

I disagree. We are not in an epistemic crisis, because the value and quality of knowledge made available to us by experts has not really changed.

What has changed is faith, by some elements of society, in that value and quality. We are now being subjected to a massive online enabled attack on expertise.

There have always been actors ready to undermine faith in such expertise. But now they have unprecedented reach, and because it is in many cases literally their business, they have become expert at building audiences.

This is the process that is undermining public faith.

You say that there is no accountability if 'these institutions' mislead the public.

But there is. Academic fraud is not without consequences. Once revealed, there are savage consequences for the actor. The whole process of scientific progress is one of error correction. Businesses shown to have maliciously misled the public (tobacco, lead in petrol...) are typically not terminated if their core product is key to the modern world, but massive lawsuits or brand damage do result.

Let's take a very concrete question. Given that there is NO scientific evidence supporting the attack on vaccines and that the danger to past and future public health is real and serious how are we to respond to this social pathology?

I'm failing to see consequences for CDC, FDA, mainstream media, intel agencies and chiefs, presenting misinformation and singing the Democrat tune.

This isn't a serious point

You said there's no problem, because there's accountability in our systems. But I'm not seeing it.

Precisely

You're not seeing it.

And part of the reason for that may be because you see the world through a lens that sees all such actors singing 'the Democratic tune'

Think about it

Do you really imagine that they are organized to operate together in such ideological harmony?

Have you ever worked in or worked with such actors? Their lives are complicated enough and confused and noisy enough to make the practical chances of them acting in such concert with other actors vanishingly small. Their biggest problem is frequently the unintended consequences of policies because policy making in a complex modern society is incredibly hard.

Is there corruption? Yes. Ideological capture? Yes. But is it as universal and as ideologically directed as you seem to believe? Emphatically not

As far as the press goes, they're not necessarily organized enough; they're lazy and partisan and happy to take party press releases as their talking points with no verification and minimal paraphrasing. As far as govt agencies go, they are also captured, or let's say, unduly influenced by either partisanship, a drive for conformity, big money from outside interests, or some combination of these.

"Academic fraud is not without consequences. Once revealed there are savage consequences for the actor."

Unfortunately, there aren't many consequences at all:

https://www.vox.com/future-perfect/368350/scientific-research-fraud-crime-jail-time

https://www.wsj.com/health/healthcare/cancer-study-retracted-research-fallout-9573f842

https://chris-said.io/2024/06/17/the-case-for-criminalizing-scientific-misconduct/

https://www.science.org/content/blog-post/fraud-so-much-fraud

"Given that there is NO scientific evidence supporting the attack on vaccines and that the danger to past and future public health is real and serious how are we to respond to this social pathology?"

Hold scientists criminally responsible for fraudulent research. Rebuild public trust so that the people believe there will be consequences if scientists deceive them.

It's pretty funny that you want to hold scientists criminally liable for fraudulent research but are not recommending holding those who promote dangerous bs criminally liable. RFK Jr is way more dangerous than the overwhelming majority of academics, as are a host of other influencers. But shutting these people down is censorship, whereas imprisoning unethical researchers is what?

Doubtless there are cases you can cite where the consequences are not as severe as you feel is warranted. But that is very different from an institutional disregard for academic fraud.

As for holding scientists criminally liable for fraudulent research that is a major change in the law that would require some more serious analysis than I can hazard a take on here.

Bear in mind that whilst many of the criticisms of the science are worthless, criticisms of promoted policies may not be. In the fog of war, with fast-changing data or a real dearth of data, recommendations can be made which may indeed lead to mandates which later results suggest were mistaken. Do you want to hold policy makers criminally liable for policy errors made in the midst of a crisis with poor data?

Agree with this. In an Irish context if public institutions want to combat misinformation there is a myriad of information that should be proactively published rather than subject to Freedom of Information requests and the original spirit of the law embraced.

"Nevertheless, I ...will criticise the views of people like Matt Taibbi, Michael Shellenberger, Glenn Greenwald, and Jacob Siegel." Dang...Go Dan!

I remember reading Influence by Cialdini; he presented experiments showing that censoring information made people seek it out more aggressively, and also increased the probability of their believing said information to be true. Seems relevant to the discussion. I was also skeptical of Florida 'book bans' despite leaning to the political right myself.

What books did Florida ban?

Good piece again.

An observation.

Not all misinformation is alike, nor are the consequences of disinformation alike.

A tricky case is that of vaccines.

There is presently an enormous amount of monetized disinformation in this space. And this is not without consequence.

The high level argument is

Vaccines are dangerous - Worse than any alleged disease

Don't get vaccinated

Any argument that you should get vaccinated is disinformation

Any regulations are oppression

The bottom line is that this is all nonsense, and to the degree it is taken seriously, it introduces real dangers into the host society.

Sam Harris presented a widely misunderstood argument here to justify mandatory vaccination:

1. There is a new, much more dangerous virus.

2. It is incredibly lethal and highly infectious and contagious for all age groups.

3. There is a highly effective vaccine.

4. Public health concerns would override personal freedoms, and mandatory vaccination would once again be necessary and justified.

5. Sometimes individual freedoms and choices need to be overridden.

6. Not every such mandate is an example of oppression by the captured government-pharma complex.

What of this?

IMHO this makes sense. There are certainly tribal sides here, but one side is informed and the other is misinformed. One side has information and trusts credible authorities, and the other relies on misinformation and, in many cases, ignorant grifters.

Doubtless there are cases where the cure, censorship, is worse than the alleged danger. But there are also cases where this is not the case. Public health is one, national security is another

I am concerned that, in the attempt to calm what is likely overblown hysteria about misinformation, we lose sight of the fact that there really is misinformation, that technology has amplified ill-informed and dangerous actors, and that the consequences of their work can be serious.

RFK is a menace with a huge and growing following and a bullhorn. I guess one solution is to wait for the next epidemic to take out his unvaccinated followers.

Fair enough - good points.

You have a lot of credibility on this issue so your take on the alarmism of the censorship-industrial complex will be effective; looking forward to it

"there’s no reason to think social media has made them worse." *no* reason? That phrase will make your talk offensive and unpersuasive. Use a softer phrase, like "misinformation has generally not penetrated beyond a fringe, and research shows that with social media this is still the case."

Thanks Arnold - good point. I added more nuance when I gave the speech.

But this - "beyond a fringe" - is just false, unless you consider the Republican Party a fringe. Which, granted, a lot of people do. But even if so, that's still a pretty big fringe.

It may not be the _de jure_ Republican Party platform plank that the 2020 Presidential election was fraudulent and stolen. But it's pretty much that _de facto_. And it really is, at least to me, significantly concerning regarding what'll happen in the 2024 election. Maybe we're not going into Civil War II. But that shouldn't be a "maybe".

Right - but it's important not to bundle together online misinformation like fake news with Trump's lies, which overwhelmingly reach people via mainstream media, not social media. But I agree this is important nuance. The way I would put it: let's place the blame on elites who spread self-serving falsehoods and propaganda, not on social media platforms.

Let's take a concrete example - Sam Harris's analysis of what is to be done if there is a new but much more dangerous virus but which happily does respond to vaccination.

How are the authorities fighting a potentially catastrophic public health challenge to respond to a raft of crazy conspiracy theorists challenging each and every proposed public health measure?

It's all very well arguing that the best way of dealing with bad information is good information and the truth will out, but what if millions are dying? What if the antivaxxers, led by such ill-informed crackpot grifters as the Weinsteins and RFK Jr, are making matters catastrophically worse? How do we deal with their tsunami of misinformation?! No facts will change their position.

Well, this goes back to what I keep saying about knocking down the weakest arguments. The steelman pro-censorship argument about "overwhelmingly reach people via mainstream media, not social media" isn't "The reach of mainstream media is much more than social media - there's nothing we can do (except finger-wag), we're beaten, we give up.". Instead, it's more along the lines of "Let's do what we can with social media on that particular front, get any gains where possible, of course there's other fronts such as the mainstream media". The idea is that the owners of social media platforms should take a strong active role in being a _part_ - a _part_ of, not the entire - the "misinformation" solution, via suppressing it wherever possible (which of course will never be 100%).

This is some of what I mean that dedicated anti-free-speech'ers have thought about the issues. Again, if one says to them "Getting control of Twitter/X won't help much, because there's Fox News and its ilk and so on", they don't reply (allow me some humor here) "Wow! I never thought about that! Thank you for explaining to me that there's a whole right-wing racist sexist Nazi fascist ecosystem out there. It's a revelation to me. I am humbled by your brilliance, and will cease my activism immediately."

Instead they might say something like roughly: "Today, Twitter/X - tomorrow, the world!" (that's if they're being polite).

Seriously, given the changes from Elon Musk, there does seem to be at least a plausible argument that those policies matter significantly (which, once more, is not to claim it's dispositive to utopia).

I am new to your substack so possibly you have written about this at length but I’d be interested in your views on UnHerd and their problems with the, er, Censorship Industrial Complex.

And I see your point about 6 minute speeches. An hour on BarPod much more useful!

Will be writing about that in a future essay - it's a complicated issue so want to get it right.

Although it sounds like you are getting bored with this topic, I'd be interested in reading your critique of Taibbi et al., and also your review of DiResta's new book (which you indicated that you were working on earlier).

I think that they are connected. I haven't read DiResta's book yet, but I've been listening to her book tour on some podcasts: ( https://player.fm/series/the-gist-2355375/defaming-diresta-for-defining-disinformation ) and ( https://www.thefire.org/news/podcasts/so-speak-free-speech-podcast/debating-social-media-content-moderation )

In her interviews, she spends most of her time defending herself from critics like Taibbi and complaining about how crazy they are. And a lot of the stuff they criticize her for -- that she is an agent of the CIA or whatever -- does sound crazy.

But talking about her crazy critics distracts from more serious (in my opinion) analysis and criticism of her work. In her interviews, she comes across as a mild, unambitious, empirical researcher just trying to gather information about what people are saying online. Then I read the grandiose, alarmist title of her book "Invisible Rulers: The People Who Turn Lies into Reality" There seems to be a disconnect.

A big problem is the substantive critique behind the "CIA agent" nonsense can't survive in the wild. I get what these people are talking about, there's a whole leftist literature about ruling-class manufacture of what's "true". Noam Chomsky started out talking about this during the Vietnam War. But nobody wants to read it except real leftist intellectual types, and they're rather thin on the ground. If you want to pull in the attention-bucks, it turns into "The CIA is plotting to mind-control psyop you!".

And to be fair, inversely, you really can't climb the public-intellectual ladder on the other side without doing something of the sort like "The MAGA's Are Coming! The MAGA's Are Coming!".

"Mankind have a great aversion to intellectual labor; but even supposing knowledge to be easily attainable, more people would be content to be ignorant than would take even a little trouble to acquire it."

- attributed to Samuel Johnson, 18 September 1709 [OS 7 September] – 13 December 1784, English writer who made lasting contributions to English literature as a poet, playwright, essayist, moralist, literary critic, biographer, editor, and lexicographer

When considering your question of how much misinformation plays a role in the public reaching poor conclusions, one must consider the likes of Johnson's quote.

How much does humanity's typically minimal desire to acquire relatively accurate and complete information weigh on poor conclusions, versus misinformation itself causing poor conclusions?

When subjected to misinformation, do we have a desire to acquire more information to more accurately balance - or even negate - misinformation? Or are many people too intellectually lazy, and just settle for the misinformation?

And extending that thinking, how much do our irrational biases play in evaluating and reaching poor conclusions when we *do* see relatively complete and accurate information? Do we *prefer* the misinformation? What important and accurate information do we minimize or discard due to our irrational biases?

Further, how much does misinformation cause poor conclusions, versus how much does groupthink and tribalism cause poor conclusions?

Your posts on not ascribing too much guilt to misinformation are good food for thought. The only question is, how *much* does misinformation cause poor conclusions in the public sphere?

There are major causes of poor conclusions other than just misinformation. A case can easily be made that most people *want* to be misinformed on various important topics.

For just one example, scientific information is widely available, yet billions of people still subscribe to various ancient mythologies. Their ancient texts may be just wonderful stories, but doesn't the way so very much of the public interprets, contorts, and exalts them lay more blame on the beholders than on the texts? Because these billions of people wish to ignore "...easily attainable..." contradictory scientific consensus?

All good points and good questions!

Thanks for posting this. A few quick reactions, as I do have some thoughts:

I believe the "censorship industrial complex" grifters are way too extreme - but they have a point, and I suspect that would be widely recognized in liberal circles if the shoe were on the other foot (Republican officials doing this). In fact, given Elon Musk's antics, and a possible Trump win, we just might see a huge reversal soon in the current dominant liberal perspective about the idea of government/media-oligarch "cooperation" (I saw something similar happen many years ago on network policy, and it was amazing to me).

" ... driving people to make damaging decisions, like voting for demagogues and rejecting public health advice"

I think there's more than a "grain of truth" here. It's more like separating wheat from chaff. There's a lot of chaff. But also a lot more than a few kernels of wheat.

"... no reason to think social media has made them worse." - well, there's plenty of reason to think social media is an overall bad part of a whole ecosystem which has *arguably* gotten worse. Stuff like outbreaks of measles is a real milestone.

"online fake news makes up only 0.15% of Americans' daily media diet".

I find this a horribly ill-considered statistical argument. It's something like:

"Some academics have tried to create a panic about children being poisoned by lead paint. They cite scare statistics about an enormous number of gallons of lead paint which has been used in housing. However, only the tiniest amount of those gallons turns into peeling paint chips. Almost all of the paint stays on walls. Moreover, considered as a percentage of material in a house - wood or brick, plaster, etc - the paint chips are an infinitesimal amount, maybe 0.15%. And still further, even if children do eat lead paint chips, that's almost always minuscule amount of the total volume of what they consume. A study of parents which asked them to keep a dairy, found few recalled seeing having their children eating paint chips. I suggest these academics are using a moral panic over lead paint and children as an excuse to avoid discussing the moral degeneracy of our society!"

Now, it's not that I disagree with the overall thrust of the points you're making. But I also know the pro-censorship argument, and the objections here are once again to the most simplistic phrasing. Perhaps you couldn't do more in the time you had. However, one response would be that they aren't saying censorship is a cure-all, but rather one very necessary part of a broader social program (to be clear, this social program terrifies me). Some "woke" advocates and similar, really do think about this.

Fair enough. But notice the point about low prevalence of online misinfo is combined with the fact it overwhelmingly preaches to the choir (a distinct point), and my objection to censorship isn't that it's not a cure-all but that it will likely exacerbate problems in most cases.

I would have preferred you had given the other speech and it was a shame you weren't able to elaborate more on your points. I was there and found the session quite odd. Five panelists playing down misinformation (I have no problem with your viewpoint essentially), but a couple of them suggesting that combating misinformation was merely censorship (and Fraser Miles misrepresenting things like BBC Fact Check). The irony was that an audience member disagreed with one of them and he angrily responded telling her she was guilty of misinformation! I nearly fell off my seat. And so much for free speech being allowed ;)

About the narrow fringe who seek out misinformation that aligns with pre-existing beliefs, I would just say that people often get pulled down rabbit holes through one issue (gender ideology, for example) and on social media then end up being pulled into other stuff, two-tier policing, obsession about prevalence of Muslim grooming gangs, vaccine stuff, political conspiracies of all kinds. I have watched this happen in people I follow and it has led them down some very crazy conspiracy paths including that the Democrats are influencing the weather to defeat Trump.

Thanks, yeah - I should have done a better job on the panel. Not a great fan of public speaking or events like that because in general I find it difficult to think on my feet. I'll be publishing an essay critical of claims about the censorship-industrial complex at some point in the next couple of weeks. Re: your point about rabbit holes: I agree that can and sometimes does happen, but I think it is pretty rare, at least among ordinary people who aren't already extremely online. And part of what's going on is that people are getting recruited into a political tribe organised around a set of seemingly unrelated beliefs, and some of these tribes have weird anti-establishment worldviews that involve lots of falsehoods and conspiracy theories. That's very unfortunate but it's not clear to me "social media-based misinformation" is the right way to analyse things.

I think the chairing of the session was quite odd and you were in a position of having to answer stuff from the floor so couldn't easily elaborate on your points.

Fair point about social media users. And I definitely agree with your general points about the alarmism.

I think a lot of people in elite circles have mis-diagnosed the problem. It's not a misinformation problem, it's a communication problem.

If they better communicated to the ordinary person online (YouTube, blogs, podcasts etc) it would alleviate a lot of the problems of misinformation.

The internet created a massive information vacuum and basically everyone from traditional institutions missed the boat.

I agree that misinformation has a long history; however, you seem to be dismissing the post-war period, when social discourse did seem more civil and fact-based, as a reference point for the current situation. If we believe the argument from the winners of this year’s Economics Nobel, inclusive institutions are crucial to prosperity. A similar argument can be made for a healthy media ecosystem. Institutions that propagate good epistemic norms are better than the current chaos of people saying publicly whatever they feel like saying (just like they used to do in private). Would you say Wikipedia is censorship when it is enforcing rules of good epistemic behaviour? I think your critique is excellent and timely. However, in reacting to the excesses of the misinformation crowd, it seems to dismiss the point that the current situation with social media is objectively very bad.

Fair enough - good points. I think the current situation with social media is a mixed bag - in some ways it's maximised variance when it comes to quality, amplifying really bad stuff but also allowing for some really good stuff much better than found in legacy media. I agree we need institutions that propagate good epistemic norms, but that's different from top-down censorship in my view.

This may be true but the consequences appear to be asymmetrical. Amplifying really bad stuff can have dangerous consequences eg wrt vaccination whereas the presence of really good stuff that is better than legacy media would not appear to do such damage.

@debunkthefunk does terrific work criticizing antivaxxer nonsense. It is doubtless better than much traditional treatment of the problem, but it is only necessary because of the profusion of dangerous nonsense.

To be clear, I don’t support top-down censorship. Peer review in science does prevent some stuff from being published, but I don’t think of it as censorship. Now, pre-publication peer review may not be what is needed in the public sphere, but having nothing at all is worse. As pointed out by the commenter John in this thread, we may be experiencing a loss of variance on social media rather than an increase. Also, whether there is some higher quality stuff on social media than in ‘legacy’ media is highly debatable to me. That is admittedly a subjective impression, but maybe people have assessed, or could assess, the variance and quality hypotheses. Lots of important stuff to dig into once we get past the fact that social media is in interaction with humans, not just a driver.

With reference to your observation about the so-called "current chaos of people saying publicly whatever they feel like saying (just like they used to do in private)" That's the opposite of my experience of the 2020s. I'm 60 years old, and I've never felt like it is more dangerous to express my true opinions in public than it is now or has been for the last 4 years.

We may be living in different environments. I work in a lefty/liberal institution and the limits on what is acceptable to say if you don't want to get cancelled seems very narrow. And some of the things that you can't say anymore would have seemed like common sense truisms 15 years ago. Maybe, a lot of the chaos that you're seeing on social media is a lot of people trapped in highly constrained speech environments cutting loose online.

You are absolutely correct. I am also in my sixties and my experience is exactly as you describe. The imagery of chaos is about the absence of any organized review of what people say, which happens in science for example ( or Wikipedia!). The illusion was that taking away the referees would lead to greater diversity. And the opposite happened because of human nature. Thanks for your insightful comment.

This study, for those interested. https://www.science.org/doi/10.1126/sciadv.aay3539

Is misinformation teaching misthinking, which will take some undoing?
