The effect of conspiracy theories

“Conspiracy,” a photo by Fleeting Pix. Colorized and digitally altered by Tucker Lieberman.
Wikimedia Commons, CC 3.0 license.

How do conspiracy theories arise? Why, despite how implausible they sound to most people, are they so “sticky” for others?

Telling stories that aren’t true

If Only by Neal Roese.

Neal Roese, in If Only: How to Turn Regret Into Opportunity (2005), discusses the role of counterfactual thinking—that is, imagining events that might have happened but didn't. At their best, counterfactuals help us analyze a situation and seek a better path. One type of counterfactual is "it could have been worse," which is supposed to serve as consolation.

Here’s one of Roese’s examples. An employee of Cantor Fitzgerald—a company that suddenly lost hundreds of employees in New York City when the World Trade Center fell on September 11, 2001—survived because he happened to be inquiring about a gym membership and was not in the office when the plane hit the building. The counterfactual narrative that he easily might have died does not meaningfully explain why he lived. The simple observation of his brush with death, applied to this situation of survivor’s guilt and taken up as an existential perspective, “is a counterfactual that shoots blanks,” Roese says. Such an approach “can get in the way of successful coping by conjuring phantom explanations and phony sense making or simply by failing to provide resolution and understanding.”

The man’s survival was random, yet that answer leaves most of us itching for something more. Some will contort themselves to come up with a different explanation.

What existential function might a conspiracy theory serve?

A conspiracy theory—pick one, any one—is, in my view, a more elaborate kind of counterfactual. It asserts itself to be true, or at least plausible and meriting more inquiry, but it is not true. Like other counterfactuals, it serves the need to point out unresolved questions and find some way to make sense of the world.

Power Corrupts: “Conspiracy Theories.” Launches May 2, 2019.

This is explored in the “Conspiracy Theories” episode of the Power Corrupts podcast that launches today (May 2, 2019) on iTunes, Spotify, RadioPublic, and Stitcher. Brian Klaas, the podcast writer and narrator, says that the tendency to adopt conspiracy theories

“seems to be part of a coping mechanism: a human instinct to deal with large, unexpected, and often tragic events. Sometimes things just happen randomly; not for any reason, not because of sinister forces. And in human psychology, randomness is much more threatening than discernible causes, even if those causes are shadowy or sinister.”

Paranoid by David J. LaPorte.

We tend to want to believe that Someone (or Something) is calling the shots and that what happens to us (or to our known world) matters within some grand plan.

Conspiracy theories are often products of paranoia. A paranoid person believes that “you can’t trust what you see, so you need to interpret and see behind the surface presentations of situations,” David J. LaPorte wrote in Paranoid: Exploring Suspicion from the Dubious to the Delusional (2015). Such people report experiencing a “sudden clarification,” which feels as if they “immediately recognize [an event] for ‘what it really is.’” Their sudden clarification feels true even if it is not.

A believer in a conspiracy theory, Klaas says, is “choosing to discount evidence and rational thought in favor of snippets of ‘What if?’ speculation.” In this case, unfortunately, “the normal way of convincing someone of an idea by presenting rational thought and evidence just isn’t very effective.” It is hard to persuade someone to abandon these theories. They are constructed in such a way that they cannot be falsified, and criticism only triggers a paranoid person’s suspicion of outsiders.

I have never knowingly been a conspiracy theorist on any matter. Generally, such stories are repugnant to my occasionally obsessive fact-checking habits, to my worldview in which ethics does not reduce to a battle between good and evil, to my personality that tends to be more trusting and less paranoid, and to the social bonds I form with people whose attitudes are similar to my own.

I do, however, see how conspiracy theories might appeal to someone else. Counterfactuals more generally—the past that wasn’t, the future that isn’t yet—are “entertaining,” according to Roese, because they are imaginative variations on a known theme, and they are “cognigenic, meaning that they spur further creative thought.” I suggest that conspiracy theories, too, fit this description. They are intricate fictions and mostly self-contained worlds. If I were to allow myself to spend time with one and if I were to engage it on its own terms, I could see myself growing fond of it.

One of Klaas’ interviewees for Power Corrupts says that believing in a conspiracy theory predisposes one to begin believing in yet another, even if the two theories are unrelated or contradictory. Klaas describes conspiracy theories as having “a weird way of metastasizing: they morph as they spread; they grow more outlandish; the conspiracy gets weirder and weirder as people build on the unhinged beliefs of others.” For this reason, to me, such stories feel a bit dangerous, like ideological gateway drugs, and I have always avoided them when I recognize them.

What we become

At the end of that road, having absorbed a multitude of conspiracy theories, a person may be well trained in the consistent rejection of logic.

Denialism by Michael Specter.

According to Michael Specter, author of Denialism: How Irrational Thinking Harms the Planet and Threatens Our Lives (2009), the rejection of science is a coping strategy for living in an increasingly technological society that every day becomes a little harder to understand. When people are fearful and “decide that science can’t solve their problems,” they may abandon scientific process and findings, gravitating instead toward some other answer on the merits of its perceived popularity. This is a problem: “Either you believe evidence that can be tested, verified, and repeated will lead to a better understanding of reality,” Specter warns, “or you don’t. There is nothing in between but the abyss.”

In politics, similarly, embracing a multitude of conspiracy theories may lead a person to distrust and reject democratic principles. Ultimately, experts are not believed; leaders are not trusted; process is not given credibility; norms are not understood; facts cannot be verified; no one can be held accountable. This is a terrible outcome, but it is hard to stop conspiracy theories from starting and spreading. Perhaps being aware of their psychological function can prompt us to think of other ways to confront the human fear of random, small, and impersonal causes.

7 thoughts on “The effect of conspiracy theories”

  1. Welcome, Tucker. Interesting and timely post.
    “A conspiracy theory—pick one, any one—is, in my view, a more elaborate kind of counterfactual. It asserts itself to be true, or at least plausible and meriting more inquiry, but it is not true. Like other counterfactuals, it serves the need to point out unresolved questions and find some way to make sense of the world.”
    Counterfactual defined: thinking about what did not happen but could have happened, or relating to this kind of thinking . . . but conspiracy theorists offer their theories up as true, don’t they? They are counterfactual but parading as fact.
    In any case we are surrounded by these claims and they do have an effect on people’s beliefs.


  2. Indeed! That is a good point. It is important to distinguish whether we are aware or unaware of our errors. Though I do think human brains are often fuzzy about that awareness! (Which is why we like fiction.)

    When we’re conscious of our factual errors, we may allow them to remain as a form of play or experimentation. I think there’s a spectrum of how conscious we are, and it can change over time. Sometimes we begin with accidental errors and contradictions of which we later become aware, but it can take a while to let go of them (e.g. someone realizes they “can’t” believe in both evolution and creationism, and they have to decide when they’re ready to let go of one). Other times, we begin with a deliberate “What if…” and gradually forget that it’s only a story.

    I can think of one episode from my life when I intensely wished that a certain event hadn’t happened. I created a complex narrative around it because I desperately needed to give the event some personal meaning. As months and years passed and I gained clarity, it became increasingly apparent to me that this particular narrative of mine was only lightly connected to reality. The original event had been real, but my How-This-Happened-And-What-It-Means story wasn’t.

    (This may also be related to what I’ve heard called the “unreliability of eyewitness testimony.” People believe they remember exactly what they saw and heard when a crime was committed, but they don’t. They remember their own fictionalized version of their memory. Much of memory turns out to be like that, a patchwork of confabulation.)

    I think this is a common reaction to trauma and grief; I’m thinking of Elisabeth Kübler-Ross’ five stages of grief, which include “denial” and “bargaining.” If someone dies (or there’s a similarly final event), on some level we know that we can’t change it—and yet psychologically we deny and bargain for a different outcome! When we’re in pain, the brain scrambles to make the pain stop or to resolve to avoid similar pains in the future. That is emotionally motivated reasoning, and in that state we’re not able to dispassionately and objectively verify what’s really true. An objective approach might be “Lightning struck; I can’t control whether it happens again; let’s move on.” If we instead make up a false story (“Zeus hit me, I deserved it”), reinforce it by repeating it, form new relationships around it, build up an identity around it, etc., it becomes hard to let it go.

    Maybe that’s part of what happens to conspiracy theorists. They might begin with fear or anger about a specific frustration, propose a story to gain a sense of control, and then do not want to give up the story. Some must be aware of the contradictions of related theories (all the possible bullet trajectories in the JFK assassination, and they can’t all be true), and many seem to have a sense of humor about the goofiness of their theory (I’m thinking of Flat Earthers), which means that, on some level, they must be conscious they’ve made an error somewhere. But that doesn’t mean they’re going to give up on their story! I suspect that, the more we become aware that we’re straddling fact and error and the less we want to budge, the louder we’re going to yell to try to persuade other people. If we can persuade other people that we’re right, then we don’t have to be the ones to move.


  3. Pingback: When to wager that a conspiracy theory is false | Episyllogism

  4. Pingback: Objective discovery, subjective interpretation? | Episyllogism
