It’s hard to venture online these days—or switch on any cable network—without coming across a heated discussion over “fake news.” Basic facts and figures, ranging from crowd sizes to poll numbers to whether or not it rained, now appear to be under negotiation. For many media consumers, it can feel as if we are living through an entirely new dystopian era, with each news cycle or press conference sending us further down the rabbit hole.
But although the term “fake news” reflects our troubled political moment, the phenomenon is nothing new, and neither is the psychology that explains its persistence.
“There’s a tendency for people to say, ‘Well, given the social media channels we have now, these things can spread more quickly and have a greater effect than ever before,’” says Adam Waytz, an associate professor of management and organizations at the Kellogg School. “There’s actually more to it than that. Many of us remember when the most prominent news outlets in the world were reporting that Iraq might have weapons of mass destruction. That was before Facebook and Twitter.”
To understand how people in the same country, or same family, can have such vastly differing takes on reality, Waytz suggests we should focus not on the role of social media, but on the role of social psychology—in particular, the cognitive bias that stems from our tribal mentalities. For Waytz, before we can learn to address our divisiveness, it is important to understand its roots.
“There’s an assumption that fake news exacerbates polarization,” Waytz says. “But it might be the case that polarization exacerbates fake news.”
The Many Flavors of Truth
To help explain our enduring susceptibility to fake news, Waytz points to two well-known psychological concepts. The first is “motivated reasoning,” the idea that we are motivated to believe whatever confirms our opinions.
“If you’re motivated to believe negative things about Hillary Clinton, you’re more likely to trust outrageous stories about her that might not be true,” says Waytz. “Over time, motivated reasoning can lead to a false social consensus.”
The second concept is “naïve realism,” our tendency to believe that our perception of reality is the only accurate view, and that people who disagree with us are necessarily uninformed, irrational, or biased. Naïve realism helps explain the chasm in our political discourse: instead of disagreeing with our opponents, we discredit them. It is also why some are quick to label any report that challenges their worldview as fake.
“It happens across the political spectrum,” Waytz says, pointing to the false rumor—circulated by liberals—that President Trump changed the Bill of Rights to read “citizens” instead of “persons.” “We’re all quick to believe what we’re motivated to believe, and we call too many things ‘fake news’ simply because they don’t support our own view of reality.”
Much of our susceptibility to fake news has to do with how our brains are wired. We like to think our political convictions correspond to a higher truth, but in fact they might be less robust and more malleable than we realize.
To some extent, says Waytz, our political beliefs are not so different from our preferences about music or food.
In one unpublished study, Waytz and his fellow researchers presented participants with a number of statements. These included factual statements that could be proven or disproven (such as “the very first waffle cone was invented in Chicago, Illinois”), preference statements that people could assess subjectively (such as “any ice cream flavor tastes better when served in a crunchy waffle cone”), and moral–political belief statements that people could assess in terms of right or wrong (such as “it is unethical for businesses to promote sugary products to children”).
In the first study, a group of participants was directly asked to read and rate statements as resembling a fact, a preference, or a moral belief. In a second study, a group of participants had their brains scanned using fMRI while reading each statement and evaluating how much they agreed or disagreed with it. After the scan, they answered the same questions as participants in the first study about whether each statement resembled a fact, a preference, or a moral belief.
Waytz and his colleagues found that, in both groups of participants, people processed the moral–political beliefs more like preferences than like facts. Not only did participants directly rate moral–political beliefs as “preference-like,” but, says Waytz, “when they read moral–political statements while having their brains scanned, the scans showed a pattern of activity that’s comparable to preferences.”
Reality by Social Consensus
Although it may seem disconcerting that our brains treat political beliefs like ice cream flavors, it also suggests that certain beliefs—like certain preferences—are susceptible to change. “We’ve all had the experience where at first we didn’t like a band but later we become a fan,” Waytz says, “and our taste for certain foods certainly evolves throughout the course of our lives.”
This is particularly true for beliefs for which public consensus is mixed. A belief like “child labor is acceptable,” against which consensus is high, is processed much like a fact. But beliefs that are more controversial, such as “dog racing is unacceptable,” are more susceptible to persuasion and attitude change and are overwhelmingly a product of the social consensus within a specific community.
This is why “fake news” is not just about social media or our tendency to skim the news—though yes, sites like Twitter and Facebook give misinformation the channels to spread at a pace we have never seen before, and roughly six in ten Americans read only the headlines. Whatever the source of the news might be, the combined effects of motivated reasoning, naïve realism, and social consensus or tribalism prevent people from reaching objective conclusions.
According to Waytz, it’s also why challenging falsehoods online might be a fool’s errand.
Satisfying though it might be for people to correct the inaccuracies or outright lies posted or tweeted by political opponents—the “Bowling Green Massacre” offered one such galvanizing moment for liberals—deferring to the official record does not change the underlying social dynamic at play.
“One of the things we’re learning,” Waytz says, “is that fact-based arguments don’t always work.” Take, for instance, a 2014 study by Brendan Nyhan, a professor of political science at Dartmouth. Nyhan’s study found that even presenting parents with hard scientific evidence that vaccines do not cause autism did nothing to persuade those parents who had previously held that belief.
A Possible Solution
So, how do we overcome ideological biases and counteract the polarization that fuels “fake news”? Waytz says that social psychology also points to a way forward.
Encouragingly, there is evidence that when you alert people to their biases, they tend to succumb to them less. A study involving Israelis and Palestinians—two groups that are famously entrenched in naïve realism—demonstrated that when the concept of naïve realism was explained to them, the groups were less hostile towards each other. “When they were told, ‘Hey, this bias exists,’ even the most hawkish among them were more conciliatory,” Waytz says.
Other studies have shown that people can overcome naïve realism by acknowledging one of their opponent’s legitimate (or semi-legitimate) points.
“If a Democrat and a Republican get together, and you have each of them offer a single argument from their opponent’s side, it makes them more open to the idea that their reality is not the only one,” Waytz says. Interestingly, studies have shown that when people are given a financial incentive to reflect on views opposed to their own, they are even less biased in the judgments they make about the other side.
Waytz also points out that this level of political discord may not last forever, at least when it comes to issues impacting people’s daily lives. That’s because we are most susceptible to many cognitive biases when we process information only shallowly. “We know that people process information more deeply when there’s potential for them to lose something,” he says.
But bridging the gap between two opposing views of reality might require deeper engagement with a more diverse set of data and news sources beyond what Twitter and Facebook can offer.
“The biggest danger isn’t actually fake news—it’s tribalism,” Waytz says. “Depolarization only occurs when someone has the courage to speak out against their tribe.”