The 9/11 terrorist attacks occurred when I was nine years old. In the days afterward, I remember obsessively reading everything I could about the attacks in my dad’s Newsweek magazines. A few years later, I stumbled upon the film Loose Change, which purported to uncover the truth behind 9/11. At first, I was skeptical. Of course, Osama bin Laden was behind the attacks—that’s what I’d read in Newsweek. But as I kept watching the film, I felt a growing sense of doubt. By the time the credits rolled, I was sure that 9/11 was an inside job, orchestrated to create a pretext for a war in Iraq and subsequent war profiteering.
As I grew up and gained more in the way of critical-thinking skills, I eventually came to recognize that the theories being pushed by 9/11 skeptics were delusional. That earlier flirtation with conspiracy theories in my adolescence has made me particularly sensitive to the rash of similar falsehoods being widely peddled today—most notably, the pervasive belief that the 2020 election was stolen, which led directly to the insurrection at the U.S. Capitol on January 6. What motivates people to believe in these blatantly bogus conspiracy theories?
According to the social psychologist Karen Douglas at the University of Kent, social crises produce noticeable upticks in the prevalence of conspiracy theories. People engage in increased attempts to make sense of the crisis in order to bring meaning and purpose to an otherwise chaotic and fearful situation. “A conspiracy theory helps people to make sense of the world by specifying the causes of important events, which further helps them predict, and anticipate, the future,” write Douglas and her collaborator Jan-Willem van Prooijen, who describe how wildly different events like the assassination of U.S. president John F. Kennedy and the fire that destroyed Rome in the first century generated similar outbreaks of conspiracy theorizing. Experiencing a crisis may cause feelings of powerlessness, and believing in a conspiracy theory allows the believer to feel some measure of control over the situation, the researchers note.
Beyond these crisis-specific surges, Douglas identifies three motivations that drive belief in conspiracy theories. Epistemic motivations are characterized by a desire for more information about a given phenomenon—such as a causal explanation of an event—especially when the existing knowledge is spotty or distrusted. When “events are especially large in scale or significant and leave people dissatisfied with mundane, small-scale explanations,” Douglas writes, people’s belief in conspiracy theories appears to intensify. Those who subscribe to these theories, in turn, feel that they know the real “truth” about an event, and their beliefs are particularly difficult to refute because anyone who challenges them is assumed to be part of the conspiracy to withhold information. If epistemic motivations arise from uncertainty, existential motivations emerge from feelings of anxiety or powerlessness. Believing that they possess special or secret information can give people a superficial sense of control over whatever bewildering situation they find themselves in, Douglas argues.
We can see these two motivations at work in the conspiracy theories that have proliferated over the past year. As the coronavirus spread rapidly and the world went into lockdown in early 2020, there was little information about the virus. People were frightened about both the dangers of Covid-19 and the potential economic fallout of the lockdowns. Predictably, there was a corresponding uptick in conspiracy theories: Covid-19 was engineered by the Chinese government or caused by 5G cell towers, or it was a ploy to microchip the population via a new vaccine. In the short term, these conspiracy theories may have provided believers with a sense that they had control over their frightening new reality—even though social experiments have shown that exposure to conspiracy theories ultimately reduces feelings of autonomy and control.
A third motivation for believing in conspiracy theories is social: people feel a sense of superiority when they believe that they have information that others don’t. A common rhetorical device used by groups that peddle conspiracy theories is the claim that other people are sheep, and that their group alone has access to the real truth. From this vantage point, members of the in-group are morally superior to everyone else, and any deviations from the predictions of the theory occur because of sabotage by outsiders.
The virulent support for QAnon, a disproven conspiracy theory whose adherents have been identified as a domestic terrorism threat by the FBI, seems to fall in this latter category. QAnon supporters believe in an utterly convoluted and outlandish narrative: a global cabal of satanist pedophiles rules the world, and former U.S. president Donald Trump is fighting a secret war against this cabal—only pretending to be an inept leader to disguise his heroic machinations. The stories that QAnon supporters tell are rich with apocalyptic Christian tropes, describing how Trump and his followers will arrest the cabal’s members during a Rapture-like event, the “Storm,” which will usher in a utopian new order.
QAnon members would have remained ignorant of these spectacular “truths” were it not for an anonymous individual known as “Q,” who made specific predictions online—stating at one point that the “Storm” would take place on November 3, 2017. After these predictions failed, Q began posting vague clues, which adherents call “Q drops” or “breadcrumbs.” Self-described “bakers” have sought to decipher these clues on their own. As noted earlier, the social motivation to believe in conspiracy theories draws upon people’s desires to feel superior to others. QAnon followers do not want to be told what to think; they want to discover the truth in Q’s messages by themselves. By allowing the movement’s followers to take part in authoring the story, Q—whoever this person really is—has deepened their feelings of loyalty and commitment to the group.
Support groups exist for those who have lost friends and family members to QAnon (on Reddit, r/QAnonCasualties is one such community), and some organizations, like Parents for Peace, are attempting to “deprogram” people steeped in such conspiracy theories. Nevertheless, the research shows that once someone develops strong beliefs of these kinds, it is very hard to change them. The best approach is to inoculate people with factual information before they go down the rabbit hole of conspiracy theories. (Not surprisingly, younger people and those with less education are more likely to believe in conspiracy theories.) To “pre-bunk” Covid-19 conspiracy theories, Cambridge University researchers have worked with developers to create online educational games like Go Viral! and Bad News. Go Viral! teaches players about the strategies being used to “spread false and misleading information” about the coronavirus. In Bad News, players build a fake news empire by using a variety of tactics to disseminate disinformation. The creators of these games hope to deter players from falling for conspiracy theories and other forms of disinformation by helping them see how easily and maliciously these lies are created and spread.
Lynn Barlow is a writer based in Asheville, North Carolina. When she’s not writing, she can be found skiing, whitewater kayaking, or playing fetch with her dog Urza. Instagram: @biggitygnar