The Surprising Speed with Which We Become Polarized Online
Apr 6, 2017

Users isolate themselves in social media echo chambers, even when they start out looking at a variety of posts.

Social media echo chambers form surprisingly fast. Credit: Morgan Ramberg

Based on the research of Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, Brian Uzzi, and Walter Quattrociocchi

Bias bubbles. Belief filters. Echo chambers. Whatever you call the ideological silos fostered by social media, everyone from Quartz to The Guardian is implicating them in the intensely divisive American political climate.

The algorithms social-media sites deploy to deliver personalized content clearly have a large role in making sure users encounter only information that agrees with their existing beliefs. But what part does a user’s own behavior play in the formation of online echo chambers? After all, an algorithm may ensure that you see posts you agree with, but it is your own decision to like a comment from Aunt Millie, or share a video from Cousin Bob.

Maybe 12 million social-media users can help us find the answer.

With the aid of a massive data set, Kellogg professor of management and organizations Brian Uzzi and a team of colleagues in Italy recently set out to explore human behavior within politically polarized online environments. They found that users rarely stay neutral: the vast majority are drawn to polarized content remarkably quickly.

Uzzi hopes that these findings can help us begin to understand and dismantle the echo-chamber effect.

“Social media has a tremendous effect on the minds of the masses and a tremendous responsibility to help educate us,” he says. “Social media makes fake news real news—and has squarely helped to bring us into an era of truthlessness and ‘alternative facts.’”

Science Versus Conspiracy

Uzzi teamed up with colleagues affiliated with the IMT School for Advanced Studies Lucca (Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi) to examine the behavior of 12 million Facebook and YouTube users.

Specifically, the research team looked at the “likes,” shares, and comments garnered by particular videos that were hosted on YouTube but also embedded on 413 different Facebook pages.


First, the researchers divided the videos into two categories based on the Facebook pages on which they were embedded.

Videos that appeared on Facebook pages like Illuminati Mind Control, Exposing Satanic World Government, and Doctors Are Dangerous—pages that served up controversial information without supporting evidence—were put into a “Conspiracy” category. Videos that appeared on pages like Scientific American Magazine, the National Science Foundation, or NASA—pages that served up scientific knowledge and rational thinking—were put into a “Science” category.

The researchers then compared user consumption patterns of these videos on both Facebook and YouTube. They examined the posts and their associated user interactions from January 2010 to December 2014. This allowed them to track, using only publicly available information, how a given user’s interactions with the two types of content changed over time.

On both platforms, some users only interacted with Conspiracy content, and some only with Science content. Meanwhile, other users started out interacting with both Conspiracy and Science content—but rapidly switched to interacting only with one or the other.

Regardless of how they started, nearly all of the users became highly polarized. The researchers defined a user as polarized when more than 95 percent of that user’s comments, shares, and “likes” went to a single category of content. In fact, 93.6 percent of Facebook users and 87.8 percent of YouTube users fell into this category.
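As a rough illustration of that threshold, here is a minimal Python sketch, with invented data and labels rather than the researchers’ actual code, of how one might classify a single user as polarized from a log of the categories of content he or she liked, shared, or commented on.

```python
from collections import Counter

def polarization_label(interaction_categories, threshold=0.95):
    """Label a user as polarized toward a category when more than
    `threshold` of their interactions (likes, shares, and comments)
    fall into that single category (the 95 percent rule described above)."""
    if not interaction_categories:
        return "no activity"
    counts = Counter(interaction_categories)
    top_category, top_count = counts.most_common(1)[0]
    if top_count / len(interaction_categories) > threshold:
        return top_category
    return "not polarized"

# Hypothetical user: 9 of 10 interactions with Science content is still
# below the 95 percent cutoff, so this user is not yet counted as polarized.
print(polarization_label(["Science"] * 9 + ["Conspiracy"]))   # not polarized
# 49 of 50 interactions (98 percent) with Science content crosses the threshold.
print(polarization_label(["Science"] * 49 + ["Conspiracy"]))  # Science
```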

Surprising Speed

However, it was the speed with which users became polarized that Uzzi found most astonishing.

For the users who started out interacting with both Conspiracy and Science videos, “we thought they would continue to look at a variety of content. But it turned out they quickly became very selective,” Uzzi says. How quickly? By roughly the time they made their fiftieth comment, share, or “like.”

“Even people who start out holding two points of view at the same time can very quickly go into an echo chamber,” he says.

That means, he points out, that changes in the media content people consume can affect user beliefs with surprising rapidity. And platforms like Facebook and YouTube are constantly adjusting what users see based on what those users have clicked on in the past. “The time horizons can be very quick,” he says. “Adjusting content in a particular way can lead people to make flash judgments and then hold onto those judgments, even when there is alternative information out there.”

Once in an echo chamber, users tend to stay there. This holds true across platforms for both users polarized to Science and users polarized to Conspiracy. “In terms of information and beliefs, we become old dogs very quickly, and once an old dog, we don’t learn new tricks,” Uzzi says.

Additionally, Uzzi says other research has shown that, once polarized, users tend to become even more polarized. “Inside an echo chamber, the thing that makes people’s thinking evolve is the even more extreme point of view,” he says. “So you become even more left-wing or even more right-wing, because the loudest voice is the thing that drives people in a certain direction.”

Debunking Doesn’t Work

So ideological echo chambers exist, and people tend to enter them quickly and stay there. How might users be persuaded to come out, particularly those in echo chambers that perpetuate false and misleading information?

The answer seems simple: debunk the false information. But polarized social-media users do not seem particularly interested in engaging with content that might contradict their beliefs. And when they do interact with debunking posts, it can backfire, causing them to cling to their positions even more fiercely.

For example, an earlier study by some of the same Italian researchers examined the commenting behavior of more than 9 million Facebook users who were polarized to conspiracy-type thinking and found that only about 100,000 of them had commented on a debunking post. And those who did interact with the debunking content tended to react by upping their activity on conspiracy-related Facebook pages.

The Efforts to Dismantle Echo Chambers

Several attempts to help social-media users escape their echo chambers have launched recently. Those include FlipFeed, a plug-in that allows users to replace their own Twitter feeds with those of random, ideologically different strangers, and Escape Your Bubble, a plug-in that inserts opposing political views into users’ Facebook newsfeeds.

While the jury is still out on the efficacy of those tools, it now seems quaint to recall the days when people expected the Internet to discourage, rather than encourage, echo chambers.

“When the Internet first hit, lots of observers and scientists thought that all this new information—suddenly made just a click away—was going to make everybody smarter,” Uzzi says. “The idea was that since people wouldn’t have to go to a library and dig through old magazines to get information, they would explore the validity of other people’s points of view. They would do fact-checking, they would listen to both sides of the argument, and so they would get to the truth about the facts more quickly.”

But that has remained, as Uzzi calls it, a “big blue-sky hope.”

Featured Faculty

Brian Uzzi, Richard L. Thomas Professor of Leadership and Organizational Change; Professor of Management and Organizations

About the Writer
Anne Ford is a writer in Evanston, Illinois.
About the Research
Bessi, Alessandro, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, Brian Uzzi, and Walter Quattrociocchi. 2016. “Users Polarization on Facebook and YouTube.” PLoS ONE 11(8): e0159641.

