Marketing | Aug 14, 2017

Podcast: Why Do So Many People Distrust the News?

Plus, how to avoid being duped by fake news yourself.

Illustration by Yevgenia Nayberg: a wrecking ball destroys the news.

Based on the research and insights of David Rapp, Rachel Davis Mersey, and Kent Grayson

Listen: Why Do So Many People Distrust the News? (16:31)

From network broadcasts to the Oval Office to your Facebook feed, it seems that the term “fake news” has exploded in use over the past year. But while deceptive or manipulative reportage is nothing new, the use of the term to mean everything from outright howlers to “news that I don’t agree with” is certainly noteworthy. So, why has fake news become such a thing?

In this month’s Insight podcast, we invite back last month’s guests to take a look at a different phenomenon of our current media culture: fake news. David Rapp, a professor of psychology at Northwestern, describes just what makes fake news so “sticky.” Rachel Davis Mersey, an associate professor of journalism at Northwestern’s Medill School, discusses what steps reporters can take to increase trust in the news they report. And Kent Grayson, an associate professor of marketing at the Kellogg School, talks about how the distrust of the media is an outgrowth of a larger distrust in institutions—and what each of us can do to leave our “media bubbles.”

Podcast Transcript

[music intro]

Jessica LOVE: Fake news. It’s a concept—and a hashtag—that’s exploded over the past year.

[montage of news clips about fake news]

“Fake news” means different things to different people. To some, it’s the spread of purposely erroneous stories meant to dupe people. For others, including some politicians, it’s a label used to denigrate media coverage they don’t agree with. And in yet other contexts, “fake news” is shorthand for a much broader anxiety about facts and truth: who gets to decide what’s true, and what happens if we can’t, or don’t want to, agree?

We’re not going to tackle all aspects of all of these permutations of “fake news”—that’s too big a task for one podcast—but we will take on a few in this episode of the Kellogg Insight podcast. Stay with us.

[Music Interlude]

I’m your host, Jessica Love.

The idea to discuss fake news actually grew out of interviews we conducted for last month’s podcast. In that episode, we talked with three professors in different disciplines about why the mind craves lists as an organizing principle.

And during those interviews, the question of how we know when to trust new information came up time and time again. So this month we bring all three professors back. And we ask them: Why is fake news such a cultural concern today?

If you think the Internet is to blame, you’re not wrong. But you’re not completely right.

Here’s David Rapp, a professor of psychology at Northwestern.

David RAPP: People have always been worried about being presented with inaccurate information. I think the difference between now and previous eras is that there’s just way more of it, and it’s way easier to access.

LOVE: In other words, Facebook might make it easier for false stories about Donald Trump winning the popular vote, or Donald Trump rewriting the Bill of Rights, to reach a really wide audience. But false narratives have been around as long as humans have.

So why are we so drawn to them?

David Rapp says there’s an incredibly simple reason why false stories spread: it’s really easy to get people to state inaccurate things.

For example, he’s found that if you ask people what the capital of Russia is, almost all of them will give the right answer: Moscow.

But let’s say you have the same people read a story in which a character says that the capital of Russia is St. Petersburg. Afterward, 20 to 30 percent of them will state that the capital is indeed St. Petersburg.

RAPP: And what’s really interesting is, some other labs have demonstrated that if you ask them later how they learned about it, they say they knew it beforehand. So they’ll tell you St. Petersburg is the capital of Russia and that they knew it before they read this text, which is impossible because they’ve never actually learned that anywhere.

LOVE: Then there’s the repetition issue. If we hear something over and over, we remember it better—even when that something is patently untrue, and we know it is patently untrue. It seems to be human nature to assume that if we can remember something really well, it must be accurate.

RAPP: So, for example, when particular politicians will say things that may or may not be true, news programs will repeat those things. They might say explicitly, “These aren’t true,” but they’ll repeat them—and repeating them on their own might be a problem.

LOVE: It’s not just the media that’s guilty of this. Think about the last time you came across a piece of fake news so wrong it was funny—like the recent hoax about great white sharks in the Mississippi River.

If you’re like Rapp, you might have felt the urge to share that particular knee-slapper with your friends.

RAPP: I’ll see something pretty ridiculous I read in the news, and I’ll post it on Facebook because I think it’s funny and I want people to see it. But if people read that and post it to other people who aren’t critically evaluative or see it over and over again, there can be problematic consequences.

LOVE: Meaning that repeating fake news—even if we explicitly point out that it’s fake—can ultimately spread it to people who don’t register that it’s fake, either because they’ve seen it again and again in their Facebook feeds, or because they simply want it to be true.

Of course, we don’t all stumble around, constantly swallowing whatever half-baked story we come across. So what are the conditions that lead us to question the accuracy of what we hear?

In one of Rapp’s studies, he has two people work together to memorize a list of, say, birds. After the pair has studied the list for a while, they’re asked to take turns remembering the items on that list out loud, one at a time, like this:

STUDY PARTICIPANT 1: Pigeon.

STUDY PARTICIPANT 2: Eagle.

STUDY PARTICIPANT 1: Raven.

STUDY PARTICIPANT 2: Owl.

LOVE: But there’s a catch. One of the participants has been secretly coached ahead of time to occasionally say an item that wasn’t on the list.

RAPP: So they might say, “chicken,” which never actually was in the list.

LOVE: Then, the true study participant—the one not feeding false answers—is instructed to write down everything they remember seeing in the original list.

RAPP: And they’ll accidentally report the incorrect thing that their partner said even though they never saw it before, and they’ll report that they actually saw that item.

LOVE: But this effect disappears if the participant thinks that the other person isn’t a reliable source.

If the guy who says “chicken” seems hesitant and uncertain about what he remembers, you’re not going to repeat the false claim that “chicken” was on the list of words you memorized together.

But in our day-to-day lives, we don’t always get these kinds of clues. Behind a slick website, everyone can seem equally confident. What we don’t know, Rapp says, is how people determine whether a news outlet is trustworthy.

RAPP: I think there needs to be more work looking at what happens when people are sitting at their computers and making decisions, what leads them to decide, “I should do a little extra work to figure out if this source is reliable or not.”

LOVE: In the meantime, he says there’s another way to prevent people from repeating inaccurate information: Ask them to actively correct it.

RAPP: We’ve shown in our lab that if you ask people when they’re reading to actually proofread the information and make corrections to the content, they’re less likely to use the inaccurate information.

LOVE: In other words, when we’re put into a situation where we need to critically evaluate what we read, it makes a difference.

There’s just one problem. When’s the last time you printed out an article from the Internet and reviewed it with a red pencil in your hand?

[music interlude]

LOVE: Now let’s switch to a different definition of “fake news,” the one people use when they distrust a particular media outlet. What can reputable journalists do to convince their audience that they are not, in fact, fake news?

Rachel Davis MERSEY: One of the great things about being a reporter is, you know a lot. You know a lot about context. You’ve thought about things deeply. Especially the best political reporters really can think through consequences in a way that the average citizen, maybe, doesn’t have the time or luxury to do.

LOVE: That’s Rachel Davis Mersey, an associate professor of journalism at Northwestern’s Medill School.

She says that journalists are, indeed, well positioned to provide accurate information. But at a time when people are more concerned than ever about the spread of misinformation, the traditional ways of reporting and presenting news may need to be revisited.

MERSEY: I’m really imploring the news business to think about transparency, to think about how the news is gathered, to talk more with audiences about how the news is gathered, how the news is reported, and to engage in more thoughtful storytelling about all the aspects of a narrative.

LOVE: One specific place to start: the use of anonymous sources. And let’s be clear: anonymous sources can serve a really important role, providing information that journalists can’t get any other way. Think of what Deep Throat did to help break the Watergate story. But they still don’t inspire a lot of confidence.

MERSEY: The audience seems very uncomfortable with anonymous sourcing, yet we see great news organizations rampantly using it, and sort of using it almost flippantly. The great example they used was, “President Trump retreats to the private residence and puts on a bathrobe, but he’s all alone.” Well, then, how do you know he’s in a bathrobe? How do you validate anonymous sources? How does that make someone feel confident about the story? And then how does that build trust? The truth is, it doesn’t.

LOVE: Audience trust can also take a hit when a media outlet makes an error. Often, it’s not so much about the error itself but the way the outlet handles it that does the most damage.

MERSEY: Mistakes will happen and mistakes need to be corrected, and they need to be corrected in a bold way, not secretly on the second page of the newspaper. People need to accept responsibility for the mistakes.

LOVE: But even when a news outlet does have a reputation for being pretty transparent about its sources or its mistakes, it’s not always easy for audiences to tell whether the information they’re getting is actually from that outlet.

Take “imposter” or “spoof” sites—which are made to look like established news sites, like CNN or ABC. They might have similar names, colors, or logos. They may even have similar URLs. But they’re actually run by someone else entirely. This can be really confusing.

Even some common practices by legitimate organizations can leave audiences uncertain who is behind the information they find. Like hosting “sponsored content” where advertisements resemble actual articles.

MERSEY: Just because it’s in The New York Times doesn’t mean it was written by The New York Times, so it could be advertising that blends in. It could be a promotion for something else, an affiliated product The New York Times is selling.

LOVE: Mersey says that sponsored content really can fool people. She points to a UPS infographic that appeared on Fast Company’s website. The ad presents UPS as a solution to the supply-chain challenges facing companies. But except for the word “advertisement” printed in tiny font at the top of the page, the infographic looks pretty darned indistinguishable from the rest of the magazine’s content.

Once people realize they’ve been fooled by ads like these, Mersey says, their confidence in their ability to tell the difference between news and advertising—as well as their confidence in the media—can be genuinely shaken.

MERSEY: It does require you to be a much more informed consumer of the news than it ever did before.

[music interlude]

Kent GRAYSON: If you look at any study that’s been tracking trust in institutions—and there are quite a few good ones—trust in institutions continues to decline.

LOVE: That’s Kent Grayson. Grayson is an associate professor of marketing at the Kellogg School and the director of The Trust Project, an initiative that brings together different perspectives on trust.

He says that it’s not just trust in the media that’s declining. Confidence in all kinds of things—the government, universities, organizations—has been going down for a while.

And regardless of whether a lack of confidence in any given institution is deserved, Grayson thinks it can become something of a self-fulfilling prophecy.

The more skeptical people are about an institution like the media, the more closely they look at it—and the more they find other reasons for skepticism.

GRAYSON: There could be very much a vicious cycle happening right now, where institutions work best when they are good, and therefore they are invisible.

LOVE: When we fully trust our government or universities or media outlets, we don’t really “see” them. They’re invisible. We don’t think critically about how they operate.

But once the trust starts to go, the scrutiny begins. We call for more transparency.

And sometimes this does the trick. We certainly want our institutions to be able to withstand scrutiny.

But often, we start to see more and more things we don’t like. Or what one person views as a well-functioning system, another person views as broken and untrustworthy.

GRAYSON: Trust in institutions is declining much more precipitously amongst people who are less educated, less affluent, and who are essentially the losers in this income gap that we’re seeing. One reason that’s potentially problematic is that those who are making the decisions about the products to sell and the regulations to make don’t think there’s a problem with trust in institutions as acutely as those who are left out. And so decisions are being made that might not be addressed at fixing the problem and which may even actually be making the problem even worse, without realizing it.

LOVE: One way to mitigate this gap, he says, is for people to escape their bias bubbles.

How might this happen? By using tools like Escape Your Bubble, a plug-in that inserts opposing political views into your Facebook feed, or Read Across the Aisle, an app that tracks the political leanings of what you read and encourages you to explore articles whose points of view you might not agree with.

He’s optimistic about what could happen.

GRAYSON: Some people live in the blue bubble; some people live in the red bubble. And the idea is that I’m sitting in my red bubble, and I believe that all the institutions that are providing me with the information that I’m getting in my red bubble are trustworthy. Same in the blue bubble—I believe that all the institutions that are providing me with information are legitimate and they’re trustworthy. I believe that as people start to see that there are mutually incompatible things happening in these two bubbles, and as they start to realize that actually some of the information I’m getting in my bubble is biased or isn’t right or isn’t true, they’ll start to understand that trust in their institutional guarantors is maybe overblown.

LOVE: Once enough people burst their own bias bubbles, Grayson hopes some of the most polarizing consequences of this “fake news” phenomenon will subside.

GRAYSON: It’s really important to be reading more of what’s happening in the other bubble than in the bubble that I already believe. Why do I need to be reassured about that? I already kind of know what I believe. So I’ve changed my reading habits.

Because otherwise we are always going to be in a world where 40 percent of the country thinks that whatever is happening in government is the worst thing in the world, and another 40 percent thinks it’s awesome, and neither one of those things can be true all the time.

[music interlude]

LOVE: This program was produced by Jessica Love, Fred Schmalz, Emily Stone, and Michael Spikes. It was written by Anne Ford.

Special thanks to our guests, Kent Grayson, Rachel Davis Mersey, and David Rapp.

You can stream or download our monthly podcast from iTunes, Google Play, or our website, where you can read more about trust and human behavior.

You can also read another article about fake news, featuring Kellogg professor Adam Waytz, on our site.

Visit us at insight.kellogg.northwestern.edu. We’ll be back next month with another Kellogg Insight podcast.
