Why People Believe False Stories and Disinformation

By Rosa Rubicondior | 25 June 2022
Rosa Rubicondior Blog

The problem is due to the way we have evolved to deal with information. (Credit: YouTube / screengrab)

Why We Fall for Disinformation | Psychology Today

In a report published recently by the US Center for Naval Analyses (CNA), a team of psychologists analysed the reasons why so many people are falling for disinformation. The problem is due to the way we have evolved to deal with information, which has ill-prepared us for the vast amount of information now being directed at us by modern technology.

In fact, some of these time-tested mental tools make us dangerously vulnerable to disinformation, especially disinformation designed to mislead and garner support for extremist groups for whom the truth would be toxic. We see this today in the form of disinformation about, for example, COVID-19, the measures to reduce its spread and the vaccines designed to protect us from it. We also see it in relation to politics, political movements and parties, international affairs, religious fundamentalism and anti-science propaganda, such as denial of climate change and evolution, and especially conspiracy theories such as those promulgated by QAnon and supporters of former President Donald Trump, intended to radicalise, undermine confidence in institutions, and garner support for extreme solutions to non-existent problems.

In other words, disinformation campaigns are designed to benefit those whom the report calls ‘malign actors’, for whom the truth would be dangerous and who know they need their target marks to believe falsehoods and mistrust the evidence.

In the abstract to their report, the psychologists, Heather Wolters, Kasey Stricklin, Neil Carey, and Megan K. McBride, say:

Abstract

Disinformation is a growing national security concern with a primary impact occurring in the mind. This study explores the underlying psychological principles facilitating the absorption and spread of disinformation and outlines options for countering disinformation grounded in cognitive and social psychological literature. The report serves as an introduction to the topic for policy- and defense decision-makers with no prior background in psychology or disinformation. It first examines what disinformation is and why it is important before turning to the four key psychological principles relevant to this topic. A key finding is that the principles themselves are neutral and normal cognitive processes that malign actors can exploit to achieve their own objectives.

They go on to say:

Today, messages of persuasion are not just on billboards and commercials, but in a host of non-traditional places like the memes, images and content shared online by friends and family. When viewing an Oreo commercial, we can feel relatively confident that it wants to persuade us of the cookie’s excellence and that the creator is likely Nabisco. The goals of today’s disinformation campaigns are more difficult to discern, and the content creators harder to identify. Few viewers will have any idea of the goal or identity of the creator of a shared meme about COVID-19 vaccines. And since this content appears in less traditional locations, we are less alert to its persuasive elements.

They identify four key mechanisms that make people vulnerable to disinformation:

  • Initial information processing: Our mental processing capacity is limited; we simply cannot deeply attend to all new information we encounter. To manage this problem, our brains take mental shortcuts to incorporate new information. For example, an Iranian-orchestrated disinformation campaign known as Endless Mayfly took advantage of this mental shortcut by creating a series of websites designed to impersonate legitimate and familiar news organizations like The Guardian and Bloomberg News. These look-alike sites were subject to less scrutiny by individual users who saw the familiar logo and assumed that the content was reliable and accurate.
  • Cognitive dissonance: We feel uncomfortable when confronted with two competing ideas, experiencing what psychologists call cognitive dissonance. We are motivated to reduce the dissonance by changing our attitude, ignoring or discounting the contradictory information, or increasing the importance of compatible information. Disinformation spread by the Chinese government following the 2019 protests in Hong Kong took advantage of the human desire to avoid cognitive dissonance by offering citizens a clear and consistent narrative casting the Chinese government in a positive light and depicting Hong Kong’s protestors as terrorists. This narrative, shared via official and unofficial media, protected viewers from feeling the dissonance that might result from trying to reconcile the tensions between the Chinese government’s position and that of the Hong Kong protestors.
  • Influence of group membership, beliefs, and novelty (the GBN model): Not all information is equally valuable to individuals. We are more likely to share information from and with people we consider members of our group, when we believe that it is true, and when the information is novel or urgent. For example, the #CoronaJihad hashtag campaign leveraged the emergence of a brand new disease — one that resulted in global fear and apprehension — to circulate disinformation blaming Indian Muslims for the its [sic] origins and spread.
  • Emotion and arousal: Not all information affects us the same way. Research demonstrates that we pay more attention to information that creates intense emotions or arouses us to act. That means we are more likely to share information if we feel awe, amusement or anxiety than if we feel less-arousing emotions like sadness or contentment. Operation Secondary Infektion, coordinated by the Russians, tried to create discord in Russian adversaries like the U.K. by planting fake news, forged documents and divisive content on topics likely to create intense emotional responses, such as terrorist threats and inflammatory political issues.

The authors go on to recommend certain actions to safeguard against falling for disinformation, most of which will be readily familiar to sceptics and those familiar with the scientific method, which is designed to catch and filter out false information and false conclusions by looking dispassionately at the evidence and allowing opinions to flow from that evidence.

The actions they recommend, details of which can be read in the report (pages 30-34), are:

  • Bolstering your resistance to disinformation
  • Bolstering resistance to disinformation in others

Most people who have spent any time debating online with creationists and conspiracy theorists will be familiar with the role cognitive dissonance plays in leading people to believe things that are demonstrably untrue while rejecting opposing views based on sound, demonstrable evidence. The authors describe the role of cognitive dissonance in the spread and acceptance of fake news and disinformation:

Cognitive dissonance theory

Cognitive dissonance happens when a person is confronted with two competing thoughts. For example, a person might simultaneously think the following: Exercise is good for my body; when I exercise, it hurts. It is uncomfortable to hold two competing ideas/beliefs at one time. Therefore, people are motivated to reduce the conflict or remove the dissonance. Dissonance theory describes how people are influenced to either accept or reject beliefs, as well as the information/arguments that accompany those beliefs. The theory includes both cognitive and emotional components. It posits that people feel uncomfortable when they have to reconcile conflicting information. Conflicting information is dissonant, whereas nonconflicting information is consonant, consistent, or compatible…

When information is incompatible with our beliefs, we react in one of four ways: (1) adding new, consonant cognitions, (2) removing the inconsistent information, (3) reducing the importance of opposing information, or (4) increasing the importance of compatible cognitions. Festinger’s classic example was of smokers encountering information indicating that smoking was bad for their health. In this case, the smoker has four options:

  1. Change behavior or adopt new attitude (e.g., stop smoking) (adding new, consonant cognitions).
  2. Continue to believe that smoking is not bad for health (remove the incompatible information).
  3. Compare risk from smoking to risk from something worse, such as auto accidents (reducing the importance of opposing information).
  4. Think about the enjoyment of smoking and its good effects, such as that it might assist with weight control (increasing the importance of compatible information).

The reason conspiracy theorists, creationists, flat earthers, climate change deniers and people who think Donald Trump was a good president expend so much effort building cults and recruiting new cult members can be found in the way group membership aids the spread and acceptance of fake news and disinformation.

False information, then, can be accepted through a combination of cognitive dissonance and the fact that belief and acceptance require far less intellectual effort than scepticism and fact-checking, especially if the false information is consistent with prior beliefs (the rabbit-hole effect, where a person’s view of reality is conditioned by the deliberate exclusion of unwanted information). It is then reinforced by group affiliation, in other words by formal or informal membership of a cult of like-minded people. The fact that other members of the group also believe the same false information makes it more likely to be believed and passed on, especially if the group is centred on believing that false information in the first place. We see this with QAnon, Trumpanzee and creationist cults, for example, where disbelief, or even expressing doubt, can be met with ostracism, online abuse and threats, while kudos is gained by making the retelling more lurid and sensational.

And we see this in religious fundamentalism, where plainly irrational and evidence-free beliefs are tenaciously held onto and defended for no better reason than that they are consistent with previously held beliefs inherited from the prevailing culture, because friends and family also believe them and because doubt, dissent and disbelief might well have dire social consequences if not actual physical ones.

Rosa Rubicondior (a pseudonym) is a retired data analyst, biologist, blogger, author and atheist.

