Intertextuality and the propagation of disinformation
Propaganda typically refers to manipulative techniques and misleading messages used to gain public acquiescence for a political cause, especially during times of war. Over the past century, George Orwell, Harold Lasswell, Jacques Ellul, and Edward S. Herman and Noam Chomsky, among others, have written about and theorized propaganda. But their approaches were developed when traditional forms of mass media represented the vanguard of message dissemination.
How are we to make sense of today’s propagandistic messages, which seem to increasingly occur online in the form of fake news? Facebook election ads created and promoted by Russian trolls. Tweets that accused students at Marjory Stoneman Douglas High School of being “crisis actors.” A president who invents facts to bolster his worldview. These and other examples are the twenty-first century’s version of propaganda, and they are aimed at spreading disinformation and sowing ignorance, division, doubt, and fear.
We need a twenty-first century theory of propaganda to make sense of today’s disinformation campaigns, whether they emanate from Russian troll farms or the Twitter feeds of the president and his cheerleaders. In a forthcoming book, The Discourse of Propaganda, John Oddo provides the theoretical purchase needed to analyze propaganda in the digital age.
Central to Oddo’s theory is the recognition that propaganda—which he defines as manipulative and antidemocratic discourse that “harms the Many and serves the Few”—involves an intertextually rich communicative process “that requires contributions from multiple agents. It can succeed in circulating only if it continually induces new audiences to recognize and recontextualize it on a mass scale.” Although Oddo illustrates his theory using case studies from the two US-led wars in Iraq in recent decades, the ideas are tailor-made for analyzing current events.
Propaganda ultimately relies upon the recontextualization of messages to gain traction and propagate. The social media ads placed by Russian trolls are a case in point. By inducing “likes” and “shares,” ads like these hold the potential to go viral, much like the conspiracy theory tweets about Marjory Stoneman Douglas High School students being “crisis actors.”
Greg Urban (1996) points out that “some kinds of discourse are intrinsically more shareable than others.” So, what makes certain messages more prone to being propagated? Oddo points to Bauman and Briggs’s (1990) explanation of how “performative semiotics play an important role in rendering discourse extractable,” citing parallelism, repetition, and dramatic pauses as examples of poetic devices that make discourse more likely to be entextualized and repeated. In the online world, one should add that shock value enhances the intrinsic shareability of tweets and posts. The more deliberately offensive or provocative a post, the more likely it is to be shared. Trolls thrive on this maxim.
Propaganda must also be detachable and mobile. Social media platforms build this capacity into their technologies, making retweeting, reposting, and sharing messages easy to achieve with a simple click. Automated propaganda bots—created and controlled by human propagandists—can help circulate messages online, amplify the number of shares, and catapult messages to trending topics. Bot creators sometimes create hashtags and “use their bots to amplify them until they’re adopted by human users,” Erin Griffith writes in Wired. “Over time the hashtag moves out of the bot network to the general public,” explains Ash Bhat, a computer science student at the University of California, Berkeley, who helped start a project to track propaganda bots.
Classical views of communication and propaganda have overemphasized their vertical, top-down character. In the classical view, as Debra Spitulnik (1996) describes, “There is a privileging of a one-way directionality from a mass communication form to the masses, who supposedly receive it and consume it.” While this may have made sense in the age of television, when classical models of communication were formulated, the age of social media exposes the inherent flaws of those models.
In contrast to the vertical, Spitulnik discusses the importance of lateral forms of communication, and Oddo similarly emphasizes the importance of horizontal propaganda: “propaganda spread collectively by a diffusion of participants.” Oddo discusses the role of deliberate vertical propagandists, but his focus on horizontal propaganda allows for unwitting actors to play a crucial role in the diffusion of messages. The humans behind propaganda bots may be involved in a deliberate campaign to disseminate manipulative messages (vertical propaganda), but the success of the campaign requires the participation of others who help to collectively circulate the messages (horizontal propaganda).
In the case of the conspiracy tweets about Marjory Stoneman Douglas High School students, the horizontal propagandists included high-profile individuals such as Donald Trump Jr. who liked the tweets and Rush Limbaugh and other pundits who recontextualized versions of the accusations on television. In the case of Russian-led pre-election propaganda, the New York Times reports, “While most of the Americans duped by the Russian trolls were not public figures, some higher-profile people were fooled.” Robert Mueller’s indictment of 13 Russians mentioned how one Russian Twitter feed, @TEN_GOP, “attracted more than 100,000 followers. It was retweeted by Donald Trump Jr.; Kellyanne Conway, the president’s counselor; Michael T. Flynn, the former national security adviser; and his son, Michael Flynn Jr.”
Lest you get too comfortable thinking you are immune from spreading misleading or manipulative messages, keep in mind that sharing outrageous messages simply to point out their absurdity—like the conspiracy theories about the high school students—helps give those messages traction. “The well-intentioned also, inadvertently, participate in the cycle of making a conspiracy theory go viral,” Abby Ohlheiser reminds us in the Washington Post. “The thing about sharing your outrage over a despicable idea is that it’s still a share.”
Making sense of propaganda in the age of social media requires recognizing with Oddo that “propaganda is a distributed activity—a dialogic process.” In other words, “it is not quite accurate to speak of a single propagandist who intentionally delivers a self-serving message to the masses.” We are all part of the intertextual web of influence that constitutes our democratic society. How we use our voices in that intertextual web, though, is ultimately up to us.
Adam Hodges is a linguistic anthropologist with interests in political discourse. His books include The ‘War on Terror’ Narrative: Discourse and Intertextuality in the Construction and Contestation of Sociopolitical Reality, and his articles have appeared in Discourse & Society, Journal of Linguistic Anthropology, Language & Communication, and Language in Society.
Cite as: Hodges, Adam. 2018. “A Theory of Propaganda for the Social Media Age.” Anthropology News website, April 9, 2018. DOI: 10.1111/AN.823