There is no such thing as a 'misinformation expert'. I can think of nothing more chillingly Orwellian than the very concept of a misinformation expert. Anyone with a reasonable grasp of the interface between human nature and man’s inherent epistemological limitations could not seriously entertain such a notion without choking on their hubris sandwich. In fact there are far fewer 'experts' on any subject than we are currently seduced into passively accepting. It seems like you cannot get more than two sentences into an MSM article or broadcast these days without an ‘expert’ being invoked as an authority on whatever is being discussed. I wrote on this subject recently (although I am not an 'expert' on it): https://grahamcunningham.substack.com/p/take-me-to-your-experts
I agree that extreme scepticism is warranted when it comes to anything other than completely uncontroversial classifications.
Good post, as usual Dan. One thing I want to add to the critiques of misinformation studies is that it seems to presuppose that researchers in the field are above the biases of virtually all human beings, either because of exceptionally good methodologies or because they are simply superior human beings.
I'm not really opposed to either. Maybe they actually do have superior methods, or maybe these researchers are essentially superhumans who have overcome petty things like bias and partisanship. But I would just really appreciate if they would demonstrate that this is true to fallible people like me instead of just telling me to trust that they're not partisan idiots.
Yes, I agree - that presupposition does often seem to play a role, and it would be good to see more justification of it.
I think they genuinely believe that their understanding of psychological biases makes them somewhat immune to spreading misinformation while of course making them expert at spotting it.
In reality, this view of the world seems to lead to hubris and, above all (and quite ironically), a failure to understand quite how powerful 'my side' bias is, which leaves them hopelessly biased. A commitment to being neither 'left' nor 'right' would be helpful, but in practice every human is fallible and they are no different.
Most of the people currently doing "misinformation research" seem to me to be unfit to do so, because they are (a) highly politically partisan and (b) unaware of the level of uncertainty in some of their own beliefs. Being able to distinguish between claims that are clearly false and claims that might be true would seem to be a prerequisite; if the researchers' own epistemics are worse than the "misinformation" they're trying to study, then it's going to get very strange.
I'm not sure about "most" - I know (and know of) many good, thoughtful reasonable researchers who strive to avoid partisan bias - but I completely agree it's alarmingly common.
"More importantly, the debate also illustrated how the liberal establishment’s approach to “misinformation” can be very costly. It might be emotionally comforting to classify all challenges to preferred liberal orthodoxies of the day as “misinformation”. It might even serve short-term propagandistic purposes. However, it ultimately makes you look silly in the eyes of a less biased public and erodes trust in social science and liberal media outlets."
I think it's also worth considering whether sustained accusations of misinformation and conspiracism that look like they come from the mainstream/left against the right can have the paradoxical effect of eroding norms on the right -- if otherwise-sensible people on the right are constantly told that conspiracy-mongers are their political allies they may start acting like they are. Nobody likes to unilaterally disarm, and people with weird or misinformed ideas can still be voters.
Great point. There's definitely something to that.
Much of this seems to be wrestling with the problem that there's a wide range of typical statements that are probably false in letter but pretty true in spirit. "Biden has Alzheimer's" vs. "Biden sometimes sounds feeble, frail, even confused". The latter is undeniably true. The former, as a medical statement, is technically a strong claim of a medical diagnosis (it might become clearly true at some future time, in which case the people saying it as a deliberate lie now will cry that they were victims of censorship, groupthink, echo chambers, etc.).
However, it's not new information that Biden's age is showing on him. Anyone who has watched him walk, or listened to him off-teleprompter even before the debate, can observe this. The issue is you go to the election with the candidate you have, not the candidate you wished you had. But sometimes when the dice are rolled, they crap out. Much of the punditry on this topic seems to me to be people saying everyone should have known the outcome of the dice-roll before it was made, applying the infallibility of 20/20 hindsight.
The political calculation here is clear: the only path which doesn't tear apart the Democratic Party would be to replace Biden with Vice-President Harris. And even putting aside the identity politics of being a woman of color, she is an utterly terrible campaigner. She has no charisma, no skill at running a national campaign, and no experience appealing to the "swing voters" desperately needed for a Democratic victory. Avoiding an almost-certain complete disaster by taking a chance on a lesser disaster is completely rational.
Very good points - I agree, especially about people treating uncertain outcomes as obvious ex post.
You make a good point re. Harris. In my view, it is still possible to opt for a candidate other than Harris, although very difficult for the reasons you say. However, even if Harris is the only option, I would still roll the dice and opt for her over Biden right now. In my view, Biden looks like a sure loss; with her, there's more variance.
Misinformation about Kamala Harris! Just kidding. No, really - it's misinformation.
See, I'm not sure. "Misinformation" refers to propositions that seem in some way to betray truth. Which is to say, any definition of misinformation depends upon an incontestable definition of truth, which we can never have. Truth is always provisional, inter-subjective, contingent upon some undefined threshold of consensus. Even so-called "objective truth" is ultimately inter-subjective, just a provisional consensus.
The two of you and many others may consider it true that Harris lacks charisma. Many others will disagree. But when her lack of charisma is presented implicitly as "truth" (as above), some of those who were inclined to disagree may suddenly feel slightly more inclined to agree. Have they been informed or "misinformed"? I don't think there's a scientific way of making the distinction; and perhaps pretending there is one is ultimately what's damaging.
All communication presumes a certain shared background framework between writer and reader, of implicit assumptions and common meanings. It is unduly burdensome if every statement about the world must grapple with the deep philosophical problem of What Is Truth, and include a dissertation on said matters. I believe that such points should only be brought up if they are directly relevant to the argument being made, otherwise we could say little - especially in contexts where space is highly constrained, like post comments!
That is, "[Harris] has no charisma" is a few words, which gesture at a conventional wisdom. They should not need to be wrapped in layers of pedantic technical hedging, as that serves no communicative purpose. The assertion can be justified if needed by reference to the extensive uninspired reactions voters have to her speeches. Note, it is not a claim that no voter, anywhere, ever, finds her inspiring (see "KHive") - that's uncharitable reading. The "scientific" aspect here is that these are statements which carry meaning which can be examined and possibly refuted, and a lack of absolute precision should not lead to a wholesale discounting of overall conceptual utility.
[Off-topic - the "six months" post seems to be subscribers-only for comments, I'm not sure that was intended.]
“The single biggest problem in communication is the illusion that it has taken place." George Bernard Shaw said that.
I say, insofar as language presumes and assumes common meanings, as it certainly does, we must also presume and assume that "truth" is provisional. This is why we need not burden ourselves with "layers of pedantic technical hedging," as you say, when defending whatever provisional definition of "misinformation" we wish to defend.
But my central point, in case it was overlooked, is that language serves more than a merely *designative* function. Language does more than just describe, represent, and refer to what's already real in the world. Language also *constitutes* what is real. When we communicate with one another, language not only designates, but also generates truth. This is just the obverse of calling truth "inter-subjective."
When we account for both the designative and constitutive functions of language, the meaning of "misinformation" becomes paradoxical. None of this demands that we dive deep into philosophy with each assertion we make. But it does suggest the sense in which each assertion we make, insofar as it reifies provisional truths, could be classified a kind of "misinformation." Ultimately the whole effort to categorically identify so-called misinformation becomes folly.
I strongly agree with this article, but I find the heading’s use of “cope” weird. There must be a better word.
Ha. Fair. I'm terrible with titles.
"Cope" (n) - a lie told to oneself to mask an unwanted reality. I think it's exactly the right word.
As someone who thinks that misinformation research is an important, challenging, and fascinating endeavor, under unreasonable attack from some quarters, I find it really disheartening that certain misinformation researchers are themselves spreading misinformation by use of such expansive definitions. Maybe there's a paradox at the heart of the field, but hopefully we can get to a reasonable consensus view of what should and should not count.
Yes, I agree on all counts. It's a difficult balancing act. There are lots of bad faith critiques of misinformation research. However, there is also lots of quite bad misinformation research, and more generally a problematic tendency among liberals to use the "misinformation" frame inappropriately in ways that lead to bad decision making.
(It's also worth noting that there's plenty of research broadly on the topic of "misinformation" in the sense that it involves abstract discussion about the psychological and social forces driving media bias and public opinion - Lippmann's work being a classic example - that doesn't necessarily involve making contentious first-order judgements about what constitutes misinformation in the context of live political debates).
I think one fundamental issue may be that the field of "misinformation research", by its very nature, is likely to attract people who are, well, control freaks who want to have a say in what other people see and think. Not that all of them are like that, but I think it will be a systemic risk in the field that will need to be addressed if there's any hope of it appearing non-partisan and non-ideological.
It's possible Martin, but that's not at all true of the folks I am familiar with and assign to my students. The examples cited by Dan are egregious but not typical in my experience. And some people (Renee DiResta in particular) have been horribly and unfairly demonized to promote the narrative of a censorship-industrial complex. My feeling is that your comment applies more to some influential folks on social media who try to shut others down, rather than to the researchers themselves. But ultimately this is an empirical question and I don't know the answer.
My impression is that the entire area of misinformation research arose as a reaction to the rise of Donald Trump (i.e. how could people vote for this guy unless their minds have been corrupted by lies?). Is this impression incorrect?
I think Trump's election definitely accelerated work on this, but there was a lot of interest in the role of social media in driving affective polarization and conspiracy beliefs before that. Work on affective polarization (people despising rather than just disagreeing with political opponents) dates back to at least 2012, and certain conspiracy beliefs that Trump was happy to leverage (Obama is a Kenyan-born Muslim, etc.) were quite widely believed much earlier. ISIS was also using social media for propaganda and recruitment very effectively well before the 2016 election, and the St. Petersburg-based IRA had cultivated several fake social media accounts within highly specialized online groups. So I think the story is more interesting and nuanced than just a reaction to Trump.
That said, I think that opposition to Trump increased interest in the field and actually weakened the quality of some research, as Dan has pointed out.
I highly recommend two papers that I ask my students to read (links below). One is by the economists Gentzkow and Shapiro and discusses financial motives driving teenagers to create fake viral stories (the Pope endorsed Trump for example). The second is by DiResta and collaborators and is focused on the IRA.
Renee DiResta is one of the most unfairly maligned and demonized people in this discourse. Ludicrous lies about her have been propagated widely, including in this very thread. I will write about this in more detail at some point, after I finish Invisible Rulers. I urge people to read her tactics and tropes paper and her book, listen to her on Joe Rogan and Sam Harris (including the episode with Shellenberger), and decide for themselves.
Links to papers:
https://www.aeaweb.org/articles?id=10.1257%2Fjep.31.2.211
https://digitalcommons.unl.edu/senatedocs/2/
There's a big difference between a far left consensus and a bipartisan consensus. Misinformation researchers are notoriously partisan left wingers, which is why the Hunter Biden laptop gets (wrongly) classified as misinformation within minutes while it took years for Snopes to acknowledge that Trump didn't actually call Neo Nazis "very fine people".
The platforms made poor decisions regarding the laptop story, as well as the shadow banning of dissenting voices such as Bhattacharya and Kulldorff during the pandemic. These decisions were not made by misinformation researchers of any ideological stripe. Ideology does affect the quality of academic research in insidious ways, as Dan has pointed out. But your blanket comments about misinformation researchers are off base in my opinion, and just contribute to what Jonathan Rauch has called the firehose of falsehood.
"decisions were not made by misinformation researchers" is a cop-out. That's like saying "Osama Bin Laden didn't fly any planes into the World Trade Center towers".
The ostensible "experts" pressured the social media platforms, and the platforms typically acquiesced and censored based on the "expert advice" they were receiving.
Misinformation researchers are like academia as a whole; there is some of what you might call political diversity, but it only spans between the left and the far-left.
As I wrote elsewhere:
There are two main problems with policing "mis-, dis-, and malinformation". First, much (or even most) of what gets suppressed is factually accurate but simply inconvenient or embarrassing to elites currently holding power (malinformation literally means truthful information).
Second, elites themselves still have free rein to play fast and loose with claims like "transitory inflation" and "secure borders" and "mass graves" and "Covid vaccines prevent both infection and transmission" without ever getting called out by their misinformation mercenaries in the "fact checking" sphere. At best, the lies are eventually memory-holed; Nellie Bowles estimates that "two years is usually the amount of time that passes before 'fake news' can become 'common knowledge'".
As Jacob Siegel put it in Tablet:
"In a technical or structural sense, the censorship regime’s aim is not to censor or to oppress, but to rule. That’s why the authorities can never be labeled as guilty of disinformation. Not when they lied about Hunter Biden’s laptops, not when they claimed that the lab leak was a racist conspiracy, not when they said that vaccines stopped transmission of the novel coronavirus. Disinformation, now and for all time, is whatever they say it is. That is not a sign that the concept is being misused or corrupted; it is the precise functioning of a totalitarian system."
From https://milesmcstylez.substack.com/p/populists-are-trying-to-save-democracy
By all means, give it a read and tell me exactly where the "firehose of falsehood" supposedly is.
I see. Misinformation researchers like Gentzkow and Shapiro, who are producing valuable research that I ask my students to read, are to platform censorship what Osama Bin Laden was to the 9/11 hijackers.
Nothing to be gained from engaging with you.
Yes, pretty much.
Renee DiResta certainly fits that description vis-à-vis platform censorship, yes.
No she does not. She has been maliciously demonized. The ludicrous lies that have been told about her are utterly shameful. I will write a post on this once I finish Invisible Rulers.
As I said, I have no interest in engaging with you further, but responded here because I want other readers to look closely at what has been said about Renee, listen to her on Rogan and Harris, read her book, and make up their own minds.
Yes, perfectly said. I remember in high school my teacher had half of us write for an argument and half against it, using the same set of primary sources. It was part of her broader educational goal of getting us to understand the rhetoric around us. Under our current broader definition of "misinformation", half of the class would be labeled misinformers by the other half, rather than just those who lied about the primary sources.
Good to see you vindicated by events.
As someone who did some research in the area of fuzzy sets and fuzzy logic, I wonder if we couldn't mitigate the problem by replacing a simple yes/no approach to classifying content as misinformation with something like a degree of misinformation on a scale from 0 to 1. Of course, the follow-up question would be how to determine this degree for a given piece of content.
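To make the idea concrete, here is a minimal sketch, assuming (purely hypothetically) that we already had a few graded signals about a claim; the signal names and weights are invented for illustration, and obtaining them is of course that hard follow-up question:

```python
# Minimal sketch of a fuzzy, graded approach to labeling content:
# instead of a binary misinformation/not-misinformation verdict, combine
# several fallible signals, each a membership degree in [0, 1], into a
# single degree of misinformation. All signal names and weights here are
# hypothetical placeholders, not a real methodology.

def misinformation_degree(signals: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Weighted average of graded signals; the result stays in [0, 1]."""
    total = sum(weights[name] for name in signals)
    if total == 0:
        return 0.0
    weighted = sum(weights[name] * degree for name, degree in signals.items())
    return weighted / total

# Example: a claim that fact-checkers mostly dispute, coming from a
# middling source, gets a middling degree rather than a hard yes/no label.
degree = misinformation_degree(
    signals={"factcheck_disagreement": 0.7, "source_unreliability": 0.3},
    weights={"factcheck_disagreement": 0.6, "source_unreliability": 0.4},
)
print(f"degree of misinformation: {degree:.2f}")  # -> 0.54
```

The aggregation itself is trivial; the contestable part, as the comment notes, is where the input degrees come from.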
The post makes a valid critique of misinformation research, yet arrives at a conclusion about the presidential race that does not seem to follow from the critique.
A large number of Republican "strategists" and of the "establishment" (such as it is) tried hard to prevent Trump from being the nominee in 2016/2020/2024. It made not a jot of difference.
What exactly are Democratic "strategists" supposed to do to summon challengers to an incumbent, line up donors behind such challengers, or rig the system in some way to produce desirable results?
There is no magic behind the curtain. The very same "mostly uninformed" who "pay very little attention" elect nominees, and a subset of them in so-called battleground states determines the winner.
A strategist with views perfectly aligned to objective facts, and not misled by misinformation experts, will do precisely zero to change any outcome.
Interesting point - you might be right.
Excellent. Has Sander van der Linden ever written anything that could be charitably interpreted as true and interesting?
I enjoyed parts of his book despite disagreeing with it.
You’re a good soul.
I'm not sure this is a problem with a solution. Even Daniel Kahneman once noted that, despite devoting his career to studying cognitive biases, he himself was no less vulnerable to them. You can't reliably study misinformation without first having a handle on truth - and our brains are very bad at this, with the brains of intelligent people perhaps being the worst of all.
Yep - it's a challenge, no doubt.
If people prefer Biden in his condition to Trump I totally get it. What I hate is lying and the support of the lying that there is nothing wrong with Biden. I just got blocked by some twit for stating that. It’s the constant pushback against the truth that I find disgusting. And then the OMG Biden needs to quit routine. NY Times, Tom Friedman, CNN. You all fucking knew.
There's some deep irony in the fact that these so-called misinformation experts are themselves nothing more than expert purveyors of misinformation.
“… erodes trust in social science and liberal media outlets.”
That trust was forfeited long ago.
Anyone in power who has the ability to control access to information will do it. It's a temptation no one in authority could ever resist. It's a lot easier to propagate a lie than to deal with an underlying, real-world problem.