Understanding How We Interpret and Share Misinformation

Quick Take

Partisanship’s effect on misinformation • The online information landscape 

First it was Brexit. Then the 2016 U.S. presidential election, followed by the COVID-19 pandemic. Mubashir Sultan watched this historic trifecta unfold, shocked by the divisiveness and information silos that characterized each event.

Sultan was in graduate school at the time, and he found his research interests being sculpted by the world events surrounding him—first while pursuing two master’s degrees in cultural psychology and in brain and cognitive science, then as a PhD student researching the psychology of misinformation at Humboldt University in Berlin. And Sultan wasn’t alone.  

“There has been a huge public, but also scholarly, concern regarding misinformation,” Sultan said. “If you look at the graph of the amount of papers being published on misinformation, it’s so steep starting from around 2016.” 

With so much research happening all at once, Sultan and his colleagues saw a need to take a comprehensive look at the studies that have been conducted on misinformation, especially on veracity judgments—research that asks participants to identify whether a news headline is true or false. They did just that in a paper currently in press in the Proceedings of the National Academy of Sciences.

Sultan and his coauthors sifted through about 4,000 articles, narrowing the sample down to 31 studies conducted with about 12,000 participants. They looked at a range of demographic factors, such as age and education, as well as psychological factors, such as analytical thinking skills, partisan bias, and motivated reasoning—the tendency to process new information in a way that fits a preferred conclusion or goal.


Using a meta-analysis of individual participant data, the team looked at about 250,000 individual choices. They also used a signal detection theory model, which allowed them to examine discrimination ability and response bias across all 31 studies.

“This is super cool from a methods perspective because it’s just something that doesn’t happen very often in social sciences,” Sultan said of their unique meta-analysis approach. “But it also gives us flexibility in the sense that we ask our own research questions, we set our own hypothesis.” 
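To make the signal detection framing more concrete, here is a minimal sketch, using hypothetical hit and false-alarm rates rather than the authors' data or code, of how discrimination ability (d') and response bias (the criterion, c) are commonly computed from veracity judgments:

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, false_alarm_rate: float) -> tuple[float, float]:
    """Signal detection measures for true/false headline judgments.

    hit_rate: proportion of true headlines correctly rated as true.
    false_alarm_rate: proportion of false headlines incorrectly rated as true.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF (z-transform)
    d_prime = z(hit_rate) - z(false_alarm_rate)             # discrimination ability
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))  # response bias
    return d_prime, criterion

# Hypothetical participant: rates 80% of true headlines as true,
# but also rates 30% of false headlines as true.
d_prime, criterion = sdt_measures(hit_rate=0.80, false_alarm_rate=0.30)
print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")
# A positive criterion reflects a general tendency to answer "false" (skepticism);
# a negative one reflects a tendency to answer "true" (credulity).
```

Separating these two quantities is what lets researchers ask whether a group of participants is genuinely worse at telling true from false, or simply more inclined to answer one way regardless of accuracy.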

Through their analysis, Sultan and his team identified several interesting trends in the demographics of who is most impacted by misinformation.  

Older adults, for example, were found to have better discrimination abilities than younger participants. Sultan surmised that having more political knowledge and more experience in general could contribute to this tendency. But previous studies have shown that older adults tend to share more misinformation, complicating this dynamic, he noted.  


“It suggests that even though they may know whether something is true or false, they’re sharing stuff because of different motivations,” he said.  

One of the strongest findings from the study was the effect of familiarity. Participants who reported seeing information repeatedly were more likely to believe it was factual.  

“This actually is one of the scariest effects, I find, because there isn’t a lot of research to show how you can intervene against it,” Sultan said.  

Partisanship’s effect on misinformation 

APS Fellow Bertram Gawronski and his colleagues at the University of Texas at Austin performed their own review of previous research on misinformation, identifying three myths that have been perpetuated by earlier studies. Their paper is soon to be published in Current Directions in Psychological Science.

The first myth the team debunks is that people, in general, are bad at determining what’s true and what’s false.  


“To be honest, when we started this work, we basically had the same assumption because, if you read that literature, if you look at media coverage on misinformation, you would naturally draw the conclusion that people are just terrible at that,” Gawronski said. 

Although people do have difficulty determining whether deepfakes and other types of AI-generated visual content designed to fool the viewer are real, verbal information is a different story.

In a review of data from more than 15,000 participants, researchers found evidence of high sensitivity in distinguishing real from fake news (Pennycook & Rand, 2021). Based on additional research, Gawronski estimated that when given large sets of true and false information, people typically distinguish truth from falsehood correctly about 75% of the time.

“The longer we worked on that—we have our own data and also looked at the data from other people—we realized that people are actually surprisingly good in distinguishing between what’s true or false,” Gawronski said.  

Still, people are not perfect at this skill, and they are much more likely to misjudge the accuracy of information when it reinforces or conflicts with their personal values and attitudes.

The second myth Gawronski and his coauthors debunk is that partisan bias is not a major factor in an individual's susceptibility to misinformation. The team took a new analytical approach to data from previous literature that had claimed partisan bias was unimportant and found the opposite: the effect of partisan bias was massive (Batailler et al., 2022).


“On the one hand, they can mistakenly accept or believe things that are false,” Gawronski said. “But there’s also a different kind of error—that people may mistakenly think that the things that are true are actually misinformation.” 

For their study, partisan bias was defined in broad terms. Gawronski equated their definition to the concept of myside bias, where individuals process information in a way that favors their previous beliefs and attitudes. This can sometimes be linked to political party, especially in U.S. discussions, but not in all cases.  
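To illustrate how a signal detection analysis can separate discrimination from myside bias, the sketch below splits one hypothetical participant's judgments into headlines that are concordant or discordant with their political identity and compares the resulting sensitivity and response criteria. The rates, the split, and the function names are invented for the example, not taken from the studies described here:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def sdt(hit_rate: float, false_alarm_rate: float) -> tuple[float, float]:
    """Return (d', c) for one set of veracity judgments."""
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# Invented rates for one participant, split by whether a headline is
# concordant or discordant with their political identity.
d_con, c_con = sdt(hit_rate=0.90, false_alarm_rate=0.45)  # concordant headlines
d_dis, c_dis = sdt(hit_rate=0.70, false_alarm_rate=0.20)  # discordant headlines

# Myside bias shows up as a shift in the response criterion: a more lenient
# criterion (more "true" responses) for concordant items and a stricter, more
# skeptical one for discordant items, even when discrimination (d') is similar.
print(f"concordant: d' = {d_con:.2f}, c = {c_con:.2f}")
print(f"discordant: d' = {d_dis:.2f}, c = {c_dis:.2f}")
print(f"criterion shift (myside bias): {c_dis - c_con:.2f}")
```

The two kinds of error Gawronski describes map onto this framework: believing false content corresponds to false alarms, while dismissing true content corresponds to misses.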

Sultan and his team found evidence of a similar pattern of partisan bias, though their study was more closely tied to political parties in the United States. Participants who identified as Republicans were less accurate at discriminating false headlines from true ones than participants who identified as Democrats.

Though Sultan said this result was not necessarily surprising, given that it has been demonstrated in previous studies, he noted that it warrants further research.

“It just opens up the question: What’s exactly happening with how this information is being produced, how is it being consumed, and why are people reacting in this way?” he said. “Or what is it with the information ecosystems that Democrats versus Republicans find themselves in that is leading them to think whether something is true or whether something is false?” 

The online information landscape 

The final myth the Gawronski team debunked is that gullibility is the main reason people believe misinformation. While Gawronski agreed that gullibility is a problem, he pointed out that people are also surprisingly skeptical.

“We find that skepticism against true information content that is incongruent with people’s beliefs is three times larger than gullibility,” he said. “And so, if we think about misinformation interventions, like digital literacy and things like that, that may help against some aspects of the gullibility.” 

Jiayan Mao, a PhD student from Vrije Universiteit Amsterdam, researches conspiracy theories from a psychological perspective.  

Although misinformation and conspiracy theories are conceptually different, Mao said, much of the misinformation spread online can be considered conspiracy theories.


“Online platforms and social media have greatly stimulated the generation of conspiracy theories and have become a vehicle for the rapid and widespread spread of conspiracy theories,” Mao said.  “Conspiracy theories are often very attention-grabbing, so people tend to share and spread this information while ignoring the truth of the event.” 

Mao and his coauthors proposed an organizational framework to help better categorize conspiracy theories, as well as the causes and consequences associated with them. Their paper, which was recently published in Current Directions in Psychological Science, introduces the acronym GIST to help classify these phenomena. GIST stands for Groups, Ideology, and Status Typology—highlighting the three major facets they explore.  

“It is necessary to distinguish between different types of conspiracy theories as such different conceptualizations have different antecedents and psychological consequences,” Mao said. “By elaborating on the three different facets as presented in GIST, we highlight the complexity of conspiracy beliefs, avoid overgeneralization of research findings, and provide direction to the domains in which conspiracy theories come into being and may persist.” 

The nature of online misinformation presents its own challenges, but there is considerable overlap with life in the offline world. For example, Sultan said the echo chambers and bubbles of individualized information that are often attributed to online platforms can also be found in the physical world.

“They are the original bubbles, actually,” he said.  

Though there is a seemingly infinite range of perspectives to engage with online, reaching those diverse points of view can require extra effort because users often find themselves enmeshed in silos of reinforcing ideas.

And techniques like microtargeting, where companies use online data to tailor advertisements to individuals, can complicate the online landscape even more. 

Microtargeted ads can pick up on information like an individual’s political identity and craft a message that fits the values and attitudes of that group.

“The more personalized information is, the more likely it fits with your political identity, and the more likely you are to believe it to be true,” Sultan said.  

But online venues of information can also provide nudges that help people fact-check the information they are sharing. In 2020, when X was still known as Twitter, the platform introduced a prompt for users that asked if they had read an article before they shared it. According to Twitter, users opened articles 40% more often after receiving the prompt. Some users also abstained from retweeting an article after they opened it.  

The more we understand about the ways people engage with misinformation online, the more it becomes possible to create interventions to help guide people toward factual information, Sultan said.   

“If we know somebody has a huge mistrust of government and institutions, then it doesn’t make sense to inoculate them, right?” Sultan said. “It makes sense to try and think about why is it that they have this mistrust of institutions and governments.” 


Ultimately, the goal for Sultan and many other researchers studying misinformation is to support a functioning society in which individuals can come to consensus based on the information they consume.

“We care a lot about informing citizens so they can make better choices and decisions for themselves,” he said.  

