I think the moral panic around misinformation has such staying power because it resolves the cognitive dissonance between two facts:
1) Present-day western elites consider themselves defenders of democracy and egalitarianism.
2) Actually existing voters have views that elites think are dumb.
Misinformation locates the source of 2) within other elites, hostile foreign countries, corporations, etc., and therefore allows people to criticize the views of voters without violating their commitment to 1).
Very good insight
Thanks. The commentary is pretty bad indeed, and that’s a further indictment of Nature if any were still needed. I’ve been disappointed to see Oreskes, whose work on climate change I highly respect, turn into a super-partisan misinformation researcher since the pandemic.
This particular passage was very jarring to me:
"To be proactive — for example, if the misinformation is anticipated but not yet disseminated — psychological inoculation is a strong option that governments and public authorities can consider using. Inoculation involves a forewarning and a pre-emptive correction — or ‘prebunking’ — and it can be fact-based or logic-based."
For the love of god, please no.
Ha - yes, there's something alarming about that quote. I have strong disagreements with Oreskes, honestly. Although I was quite persuaded by "The Merchants of Doubt" when I first came across it, I am tempted to revisit it because I suspect I would view it very differently now that I have engaged with her more recent stuff, including "The Big Myth", which - although it contains some interesting material - primarily involves an extremely uncharitable and implausible analysis of both the content and origins of conservative/libertarian ideas (and I say that as someone who is neither a conservative nor a libertarian). There's a "naive realist" take on the world - roughly, "my worldview is self-evidently correct, such that any deviation from it can only be explained by propaganda, lies, and gullibility" - which I think is very misguided and yet seems to underlie lots of her work.
> and it can be fact-based or logic-based."
Implying these propagandists are capable of either.
"They also imply that certain critics are deploying the same tactics “used in the decades-long campaigns led by the tobacco and fossil-fuel industries to delay regulation and mitigative action.”"
The authors seemed to be poisoning the well in a way that the logic-based inoculation techniques they endorse should be able to counteract...
The fact that people designing "vaccines" to inoculate the masses against manipulation techniques and reasoning fallacies apparently find it very difficult to avoid such techniques and fallacies themselves doesn't fill one with optimism about the project...
"Science" and misinformative rhetoric seem to go hand in hand these days.
The Carbon Brief fact check of Bjorn Lomborg's claim that a Lancet study demonstrated a reduction in temperature-related mortality from current levels of warming was an absolute masterclass.
Their keynote argument was that the study, while it did find cold-stress deaths reduced by double the amount of the increase in heat-stress deaths, did not specifically attribute this change to temperature. This would have made a provisional point - except they proceeded to cite studies that specifically attribute a rise in heat-related deaths to warming. And, yes, they made no mention of whether any studies made an attribution claim about reduced cold-stress deaths.
In other words, the fact check's own method for making an attribution claim was manifestly more biased and less reliable. This was a high-quality, exhaustive fact check that substantially reduced my trust in misinformation fact checks as contributing positively to information flows.
The simple truth is that if you create institutions with the power to suppress points of view, the people who are interested in having that power will be the ones who operate them.
Instead, institutions such as prediction markets, which create a properly incentivized platform for fairly seeking correct information, have much more promise. X's community notes doesn't have a formal incentive structure, but I trust it far more than officials appointed by politicians.
I'm personally agnostic about the size/effect of misinformation but I feel like some of what you have here gives me more concern, not less...
You say: "For example, Republican election denial (a misperception) has complex psychological, social, and political causes, including general psychological biases, intense political polarisation, and institutional distrust. Misinformation undoubtedly plays a role, but estimating the nature and magnitude of that role is complicated. This means the presence of alarmingly popular misperceptions does not vindicate alarmism about misinformation."
It seems like this maybe is a little reductive--if these things are mutually reinforcing, then the presence of polarization and distrust may be evidence of the effect of misinformation. The presence of psychological biases may exacerbate the effects of misinformation.
"Estimating the nature and magnitude" of the role of misinformation is indeed complicated, but the catalogue of confounding factors you present should not be thought of as things that necessarily or obviously diminish the risk or efficacy of misinformation--it seems just as likely that they increase the risk/efficacy, and may in fact "vindicate alarmism."
Good point - I agree these things are not independent. I think still people tend to assign too much weight to misinfo specifically in their causal model of misperceptions. I go into the case more here: https://www.conspicuouscognition.com/p/how-dangerous-is-misinformation
One of my basic frustrations with institutions is that they are like children when it comes to mechanisms of trust. When I've engaged with members of the technocratic class in the past, citing the likes of Sweden during the pandemic as an example, the standard pushback I get is that they are Swedes! It's a fallacy that a highly educated population leads to better behaviour; instead, it's a generational legacy of trust earned by objective, warts-and-all information, and by treating citizens as responsible adults rather than unruly children who need to be managed.
Let's look at an example. Japan had roughly 1,300 potential mRNA vaccine deaths from myocarditis within two weeks of vaccination. Obviously some of these deaths would have been natural, but let's be generous and place the figure at 1,000 (for easy maths). There are 125 million people in Japan, and roughly 100 million were double vaccinated. Now, let's divert a little. Let's say your entire older, at-risk, willing-to-vaccinate population is vaccinated. Given that the risk of dying from Covid was around 1 in 650K for anyone under 30, and that the risk of dying from mRNA vaccines looks like 1 in 100K, doesn't this mean that only those under 30 with extreme comorbidities should have been vaccinated, given that the more vulnerable population already had the best protection possible?
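The back-of-envelope arithmetic in that example can be reproduced in a few lines. To be clear, every input below is the commenter's own assumption (the rounded 1,000 deaths figure, the 100 million double-vaccinated figure, the 1-in-650K Covid risk for under-30s), not verified data; the sketch only shows how the 1-in-100K figure and the implied risk comparison fall out of those assumptions:

```python
# All inputs are the commenter's assumed figures, not verified data.
suspected_vaccine_deaths = 1_000      # rounded down from ~1,300 "potential" deaths
double_vaccinated = 100_000_000       # roughly 100 million double-vaccinated in Japan

# Implied per-person risk of death from vaccination under these assumptions
vaccine_death_risk = suspected_vaccine_deaths / double_vaccinated
print(f"vaccine death risk: 1 in {round(1 / vaccine_death_risk):,}")  # 1 in 100,000

# Commenter's assumed Covid death risk for anyone under 30
covid_death_risk_under_30 = 1 / 650_000

# Under these assumptions, vaccination risk exceeds Covid risk for under-30s
ratio = vaccine_death_risk / covid_death_risk_under_30
print(f"vaccine risk / Covid risk (under 30): {ratio:.1f}x")  # 6.5x
```

Whether the conclusion follows depends entirely on those inputs, which the reply below disputes.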
My point is this - misinformation doesn't thrive because of the plethora of information sources available (or rather, the social costs of laissez-faire information would be lower than the social costs of paranoia, suspicions of game-rigging, and the loss of social trust from censoring). The reason? The ability to detect deception isn't correlated with general intelligence. Workers know when incentivised pay systems are non-proportional - it's why a thank-you at the end of a shift is proven to be a better motivator than non-proportional pay incentives (whereas proportional pay systems add 50% productivity).
Anyway, all of this is a roundabout way of stating that institutions shouldn't deploy instrumental rationality within science to push policy. Most people would have been OK accepting a tiny risk from vaccination in order to prevent a risk of death in excess of 1% for their mothers and grandmothers. But they will never forgive the institutions for concealing the grim calculus of lives saved versus lives lost - for not being consulted in the adult decision to take a small risk by deciding to vaccinate for the benefit of others. Most people actually want a chance to be just a little heroic and self-sacrificing - it's a very human trait, not peculiar to any particular portion of the political spectrum.
Misinformation doesn't thrive because of its ability to spread in a new technological environment, it breeds in the vacuum of trust created by institutions which have a tendency to be the worst enemy of the objective they are pursuing.
Good points. Agree with a lot of this (although I don't really have any expertise on or much knowledge about the specific vaccine/myocarditis stuff).
Please, please, don't get conned by the lying campaign.
"Brandolini's law / Bullshit Asymmetry Principle"
"The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it."
The "myocarditis" talking-point is a trick: it relies on strawmanning public health advice, combined with counting only Covid death as the outcome and neglecting any other destructive effects. If you slice things finely enough, you can indeed get to a small subset of the population where Covid vaccination is arguably a wash. This is then unfavorably contrasted against the idea that everyone should get vaccinated, to proclaim that public health lost everyone's trust, oh woe.
I really understand the censorship argument sometimes. On one side, there's public health fighting a worldwide pandemic, with incomplete information, and being human themselves. THEY SOMETIMES GOT THINGS WRONG! But on the other, there's a bunch of utterly shameless demagogues, willfully and maliciously spreading falsehoods and taking things out of context, attacking the health efforts, just because it's profitable for them. These are moral monsters.
Yet people are supposed to endlessly volunteer to clean up after the bullshit, with a smile on their face while doing it.
Oh, yes, the mainstream media, for example, completely messed up on the Hunter Biden laptop story, utterly biased, wrong, etc - "both sides". But there's just no liberal "misinformation" comparison to the utter and ongoing evil mendacity of the Covid anti-public-health campaign.
[N.b, no offense meant to the commenter - I'm talking about Fox News, some members of Congress, certain talk-show hosts, and similar ilk.]
Perspicuous, thorough, judicious. A tonic essay. Bravo!
hear! hear!
In fact, Oreskes' main contribution was the claim that the disinformation was linked to the right, which I now find suspicious.
Nature may once have been a top scientific journal, but now it is increasingly junk. Which is a shame, but it's their own damn fault.
"Instead, you need to establish that misinformation research qualifies as an objective science of a [sort] that ought to inform technocratic (i.e. expert-driven) policy guidance."
I suspect that many of the people on the other side of the argument genuinely don't understand the philosophy-of-science point I think you've been trying to make, which I take as sort of a structural argument about the nature of specialization or expertise. It might be useful if you wrote a piece walking through some illustrative examples in a less-controversial context to show that there are ways of slicing up the world that are conducive to expertise and others that are less conducive (maybe actual medical specialties compared to hypothetical ones where you'd still essentially be a generalist, or maybe subfields of biology and whether you could be an expert in something like "large organisms").
Yeah, I think you're right that this point is often not understood. Thanks for the advice - I might write a piece specifically on this.
Not just on the other side of the argument... I've seen educated people argue that public health doesn't qualify as an objective science and doesn't constitute real expertise. Like misinformation research, it can at most hope for a 'best effort'.
The central problem with misinformation is that (lots of) people want to be misinformed. I remember reading something written by a church member trying to explain to his fellow members that the Procter & Gamble satanic symbol story was nonsense (it was actually propagated by Amway, IIRC). Rather than being grateful to learn that they could continue to use familiar household products without fear of eternal damnation, they were angry at him for spoiling the fun of spreading the story. Many are doubtless enjoying the same fun with Trump and Fox News.
The unattributed para you cite seems like a perfectly reasonable response to Uscinski et al, and your treatment of it seems like a misrepresentation (though I haven't read it in context). I would read it as "while, as Uscinski et al say, the concept of misinformation should not be applied loosely, there are plenty of significant cases where there is no doubt about the falsehood of the relevant belief: Holocaust denial, election denial and anti-vaxxerism are examples."
Reading this para as "Uscinski et al deny the reality of the Holocaust" is uncharitable at best and misleading at worst.
Spelling the point out a bit further, all concepts in political debate are applied too loosely (neoliberalism, socialism, fascism etc). Making this point, by itself, doesn't get you far.
My point is that nobody (including Uscinski et al) denies that there are unambiguous falsehoods. Hence it's irrelevant to point out some. And it is not fundamentally "looseness" that critics object to.
> However, if you think—as seems to be the case—that people are already highly suspicious of manipulation and low-quality misinformation is relatively rare in their information diet,
Excuse me, what? Have you... seen... people? People in Israel see an entirely different set of information concerning the war in Gaza than people in America. Ditto for Russia, Ukraine and, well, America again. In both cases, huge swaths of populations accept the facts uncritically (partially due to not caring). And in both cases, due to sheer contradictions, at least one of the sides must be badly misinformed. (I am certain you can think of within-country examples and non-international topics as well: social bubbles are a thing.)
Of course. Cherry-picked information is pervasive - as I've argued extensively elsewhere. However, this is often not because of gullibility but because people are tribal, biased, and more concerned with rationalising favoured narratives than getting at the truth. As I explore here: https://www.cambridge.org/core/journals/economics-and-philosophy/article/marketplace-of-rationalizations/41FB096344BD344908C7C992D0C0C0DC
How does this square with the claim that "people are already highly suspicious of manipulation"?
That holds if their goal is to get at the truth. If it's not, they welcome biased and misleading communication - so it's not best understood as "manipulation".
Why not? Exploiting preexisting biases seems to be at the core of most manipulations.
Sometimes - but sometimes it's not about exploiting but catering to audience demand.
A major problem with any effort to assist folks in winnowing the relatively reliable from the rest is that we have a long-standing, major industry predicated on the promulgation of what Mark Twain called terminological inexactitudes. Which industry? Advertising and PR.
Over my 70 years of reading/watching/listening to media, it seems inevitable that constant exposure to this considered mendacity helps to make dubious statements seem ordinary, particularly where they have an emotional hook, whether subliminal or not.
Have you seen this: https://www.chronicle.com/article/the-distortions-of-joan-donovan -- "The Distortions of Joan Donovan: Is a world-famous misinformation expert spreading misinformation?"
It is mostly a story about academic infighting and backstabbing that doesn't really address the quality of Donovan's scholarship. But it is kind of suggestive that there may not be much there there in her "world-famous" research. I'm thinking about digging into it b/c librarians seem to be prominent in her proposed solutions. I wonder if you've read anything by her.
Thanks for this. Yeah, I read this story. Interesting stuff. I don't know her work very well, except that it seems to conform to a certain kind of activist approach that is pretty common in the area.
It seems odd to me to look at the events of Jan 6 and claim that misinformation is not a serious threat.