A practical guide for political fact-checkers
To defend his policy of separating immigrant children from their parents, Trump uttered several bald-faced lies to deflect responsibility for the humanitarian crisis he created. Although the lies he and administration officials repeated seemed to represent a new nadir even for this presidency, the constant flow of misinformation from this White House has vexed political journalists from day one. How do you cover a president who frequently makes false claims without lending credence to the misinformation?
The political press’s answer has been to double down on fact-checking articles that correct the record. But the way the press typically goes about this is all wrong, according to scholarly insights on how misinformation persists (e.g., Cook, Ecker, and Lewandowsky 2015; Schwarz, Newman, and Leach 2016). Repeating a false claim, even in order to debunk it, can strengthen it in memory, and retracting a claim without supplying a replacement leaves a gap that the misinformation readily fills. The key to correcting misinformation therefore lies in creating what George Lakoff calls a “truth sandwich” or adopting what John Cook describes as a fact-myth-fallacy structure for refuting false claims.
Given these impediments to countering misinformation, political journalists need to adopt a different approach to maximize the potential for setting the record straight—an approach that draws from misconception-based learning and “inoculation theory” (e.g., Cook, Lewandowsky, and Ecker 2017). John Cook and colleagues (2015) explain that an explicit warning should preface statements of misinformation, the facts that counter misinformation need to be repeated and strengthened, and corrections need to explain why the disputed claims are erroneous.
To illustrate how political journalists might apply these principles more effectively, let’s consider the example “The facts about Trump’s policy of separating families at the border.”
The article, which comes from the Washington Post’s coverage of the administration’s false claims about its child separation policy, opens by reiterating, and thereby reinforcing, the false claims rather than the facts. This occurs before any disclaimer that the claims are false and well before the facts at odds with them are introduced. The false claims are repeated and strengthened both through direct reported speech (a series of five quotes that allow administration officials to reanimate the false claims in their own voices) and through indirect reported speech, as the reporter paraphrases the administration’s main claim. Rehearsing the false claims before warning readers that they are false is a crucial misstep because “strengthening of the initial misinformation seems to have a stronger negative effect than strengthening of the retraction has a positive effect” (Cook et al. 2015).
After the administration’s claims are introduced and repeated, the article states: “These claims are false.” This is a clear warning, but it is not an advance warning. The warning should precede the false claims, and both the warning and reporting of false claims should appear after a clear statement of the key fact in the case.
Next, the article explains why the administration’s claims are false. This is an important move, but the explanation falls short because it delays the presentation of the key fact needed to debunk the administration’s main claim. Compare the first statement below (the journalist’s paraphrase of the administration’s main claim) to the second statement (the key fact that refutes the administration’s claim).
1. “The president and top administration officials say U.S. laws or court rulings are forcing them to separate families that are caught trying to cross the southern border.”
2. “No law or court ruling mandates family separations.”
Juxtaposed this way, the fact in (2) succinctly and straightforwardly refutes the administration’s claim in (1). In the article, however, the second statement does not appear until two paragraphs after the first, when it should come before. The intervening paragraphs delay the presentation of this key fact, adding layers of detail whose unnecessary complexity distracts readers and slows their ability to update their mental model of the situation.
In other words, the article starts by reinforcing an erroneous mental model (repetition of the false claims), removes a key piece of that mental model (stating those claims are false), and delays filling the resulting gap. As Cook and colleagues (2015) explain, “If a central piece of the model is invalidated, people are left with a gap in their model, while the invalidated piece of information remains accessible in memory. When questioned about the event, people often use the still readily available misinformation rather than acknowledge the gap in their understanding.”
A similar look at other articles in the Washington Post and New York Times suggests that this ineffective approach to debunking false claims is quite prevalent in the nation’s newspapers of record. Why is this so? The short answer may have to do with the nature of political reporting. The objective of journalism, after all, is to report on what is happening, which includes what the president and administration figures are saying. When those figures say things that are factually incorrect, the focus tends to remain on what they have said: the false claims are foregrounded and become the focal point, while the facts needed to refute them get buried.
To effectively counter false claims and misinformation, factual corrections need to enter the intertextual web of public discourse through messages that are structured for maximum effect. This requires foregrounding and repeating what is factually correct, warning readers before introducing false claims, and unpacking the fallacies that distort the facts. Although even these best practices may fall short with strident partisans, fact-checking journalists trying to set the record straight for the rest of the citizenry would be well served by adopting this fact-myth-fallacy structure for countering misinformation.
Adam Hodges is a linguistic anthropologist with interests in political discourse. His books include The ‘War on Terror’ Narrative: Discourse and Intertextuality in the Construction and Contestation of Sociopolitical Reality, and his articles have appeared in Discourse & Society, Journal of Linguistic Anthropology, Language & Communication, and Language in Society.
Cite as: Hodges, Adam. 2018. “How to Counter Misinformation.” Anthropology News website, July 9, 2018. DOI: 10.1111/AN.899