
QAnon Doesn’t Make Sense, but Its Popularity Does


(Photo: The Eye of Providence on the back of the $1 bill has become symbolic of secret societies and conspiracy theories more generally.)

The surge in media references to QAnon over the past couple of months probably has many Americans wondering the same thing: how could anyone possibly believe this nonsense? After all, this complex and ever-evolving conspiracy theory “hangs on a core belief that the world’s levers of power are wielded by prominent liberals and celebrities who are literally pedophiles and cannibals and who kill and eat children, and that Trump has been anointed as a savior who will render justice and liberate the innocent,” as Salon’s Roger Sollenberger wrote recently.

Despite its nonsensical premise, QAnon has purportedly attracted millions of followers. Understanding the allure of this and other conspiracy theories has become urgently important, given their potential impacts and how rapidly they can spread on social media. Look no further than the recent controversies that have erupted over basic public health measures enacted during a pandemic to see the very real consequences of a sizeable portion of the country believing and promoting baseless narratives.

But the very notion that QAnon is taken seriously by millions of Americans is almost as baffling as its premise. How can preposterous conspiracy theories like this one nevertheless be so seductive?

As a doctoral student at Duke University’s Nicholas School of the Environment, I studied the intersection of cognitive psychology and environmental policy, and how people approach decisions and perceive risk. The extensive interviewing that I conducted as part of my dissertation research taught me a great deal about the political and emotional factors that influence people’s perceptions of reality, and their willingness to engage with conspiracy theories. 

It may be tempting to dismiss conspiratorial thinking as simply arising from bad information, and to assume that a little evidence (or mockery) will suffice to dispel such wayward notions. But anyone who has tried to talk a friend or family member out of these kinds of beliefs using facts knows that it is not so easy. That’s because a conspiracy theory’s appeal has less to do with its premise or details, and more to do with psychology — which also explains why most people who believe one conspiracy theory also buy into others. 

No matter how logical we consider ourselves, and no matter how certain we are that our views are grounded in facts and reason, emotions and cognitive biases still play a significant role in how we perceive the world and make decisions. Research suggests that people are drawn to conspiracy theories to meet certain psychological and emotional needs. For example, conspiracy theories are more attractive when people are afraid or feel a loss of control. Simple, understandable narratives that “connect the dots” are more appealing to an anxious mind than coincidence and randomness. 

In fact, studies have demonstrated that an increased tendency to see patterns in random noise corresponds with a stronger need for control and a discomfort with ambiguity. These individuals, not surprisingly, are more predisposed to conspiratorial thinking. The idea that a disaster or other significant event is the work of a network of powerful individuals is somehow less threatening to some people than the possibility that it happened at random, which may explain why many people were quick to believe, for example, that COVID-19 was intentionally created and spread.

Those who feel as though their cultural or political tribe is under threat are also particularly susceptible to conspiratorial thinking. People long for reassurance that their groups are the good guys, and a simple “heroes vs. villains” narrative resonates emotionally in a way that nuanced policy preferences never will. For some political conservatives who feel as though their vision of America is under attack by the wicked forces of liberalism, QAnon offers the kind of reassurance that is simply too good to pass up. Not only did QAnon spin Robert Mueller’s investigation of Trump’s 2016 campaign into something positive (in this telling, the investigation was actually a cover for Mueller’s and Trump’s joint efforts to stop the child-torturing cabal), but it also literally demonizes Democrats and Hollywood celebrities.

In addition to offering a refuge from threats, belief in a conspiracy theory can also confer on someone a sense of individual importance — the feeling of being among the few who really get what’s going on. At the same time, being one of a select group also provides a sense of belonging that might otherwise be lacking. The popularity of flat earth society conferences, for example, suggests that, at least for some people, the desire for community might be motivating conspiratorial belief, rather than the other way around. Likewise, the sheer number of QAnon groups on Facebook reflects the powerful allure of feeling accepted and appreciated. But that positive reinforcement might also contribute to the difficulty of disabusing people of these beliefs: losing the belief risks losing the sense of belonging. 

But regardless of what initially draws people to a particular conspiracy theory, once they have accepted its premise as plausible, confirmation bias will likely ensure that any subsequent “investigating” will only serve to reinforce the perceived legitimacy of its premise. Confirmation bias is one of the most common, and most pernicious, cognitive biases. We instinctively seek confirming evidence when testing an assumption, but often fail to correspondingly seek falsifying evidence, which, of course, is critical to establishing validity. Predictably, someone who only searches for information that supports a premise will tend to conclude that it must be true. For example, anti-vaxxers are notorious for defending their beliefs by claiming to have conducted their own research, which strongly suggests that confirmation bias is playing a role. Similarly, QAnon followers themselves have referred to having gone “down a rabbit hole” after being introduced to it, underscoring how confirmation bias can steer people from mere openness towards hardened belief. 

Once embedded, conspiratorial beliefs are frustratingly difficult to dislodge, no matter how outlandish they may be. When presented with evidence that contradicts the theory, believers are unlikely to reevaluate their beliefs, but will instead engage in motivated reasoning to reconcile that evidence with their pre-existing conclusions. They might simply dismiss contradictory evidence (fake news!), or they might engage in a kind of mental gymnastics that leaves them certain that the new information doesn’t actually challenge their beliefs and might even support them. 

For example, the fact that QAnon’s predictions have repeatedly failed to come true does not deter its believers in the least, in large part because they consider its basic premise to be incontrovertible fact. Any new information must either support that conclusion or be dismissed. The same response can be seen from flat earthers, who have shown remarkable resistance to the overwhelming evidence against the earth being flat. In a way, motivated reasoning functions like a missile defense system that prevents inconvenient facts from landing. 

So, if contradictory evidence has little impact on belief, can anything be done to counter these theories? Social media companies like Facebook and Twitter have recently begun removing QAnon content and groups but will nevertheless struggle to keep up with its spread. Although these companies should absolutely continue doing so, it is important to note that such removals carry the risk of triggering a boomerang effect that might end up reinforcing conspiratorial beliefs: after all, why would these powerful actors remove groups or label posts as misinformation unless they are threatened by the truth? And, as previously noted, mocking QAnon followers, or engaging them in online debate, will do little to dislodge their beliefs, and might even reinforce them.

Conspiracy theories offer simple explanations for complex situations. Unfortunately, there is no such simple means for breaking their hold on believers. But while efforts to reach believers en masse will likely prove ineffective, individuals might have more success in one-on-one encounters. Understanding the psychological factors that drive someone’s conspiratorial beliefs will offer clues as to how to approach them. When dealing with friends or family members, for example, listening, empathizing, and then gently offering a perspective that speaks to their underlying emotional motivations can actually be very effective in softening their beliefs. 

(Dr. Sean Lonnquist is a 2020 graduate of Duke’s Nicholas School of the Environment, where he studied cognitive psychology and environmental policy. This piece appeared on Medium.com.)  Prepped for CityWatch by Linda Abrams.

