The Surprising Speed with Which We Become Polarized Online
Organizations Data Analytics Apr 6, 2017

Users isolate themselves in social media echo chambers, even when they start out looking at a variety of posts.

Social media echo chambers form surprisingly fast.

Morgan Ramberg

Based on the research of

Alessandro Bessi

Fabiana Zollo

Michela Del Vicario

Michelangelo Puliga

Antonio Scala

Guido Caldarelli

Brian Uzzi

Walter Quattrociocchi

Bias bubbles. Belief filters. Echo chambers. Whatever you call the ideological silos fostered by social media, everyone from Quartz to The Guardian is implicating them in the intensely divisive American political climate.

The algorithms social-media sites deploy to deliver personalized content clearly have a large role in making sure users encounter only information that agrees with their existing beliefs. But what part does a user’s own behavior play in the formation of online echo chambers? After all, an algorithm may ensure that you see posts you agree with, but it is your own decision to like a comment from Aunt Millie, or share a video from Cousin Bob.

Maybe 12 million social-media users can help us find the answer.

With the aid of a massive data set, Kellogg professor of management and organizations Brian Uzzi and a team of colleagues in Italy recently set out to explore human behavior within politically polarized online environments. He and his colleagues found that users rarely stay neutral, with the vast majority drawn to polarized content remarkably quickly.

Uzzi hopes that these findings can help us begin to understand and dismantle the echo-chamber effect.

“Social media has a tremendous effect on the minds of the masses and a tremendous responsibility to help educate us,” he says. “Social media makes fake news real news—and has squarely helped to bring us into an era of truthlessness and ‘alternative facts.’”

Science Versus Conspiracy

Uzzi teamed up with colleagues affiliated with the IMT School for Advanced Studies Lucca (Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi) to examine the behavior of 12 million Facebook and YouTube users.

Specifically, the research team looked at the “likes,” shares, and comments garnered by particular videos that were hosted on YouTube but also embedded on 413 different Facebook pages.

First, the researchers divided videos into two categories. They did this by analyzing the Facebook pages on which the videos were embedded.

Videos that appeared on Facebook pages like Illuminati Mind Control, Exposing Satanic World Government, and Doctors Are Dangerous—pages that served up controversial information without supporting evidence—were put into a “Conspiracy” category. Videos that appeared on pages like Scientific American Magazine, the National Science Foundation, or NASA—pages that served up scientific knowledge and rational thinking—were put into a “Science” category.

The researchers then compared user consumption patterns of these videos on both Facebook and YouTube. They examined the posts and their associated user interactions from January 2010 to December 2014. This allowed them to track—using only publicly available information—how a given user’s interactions with the two types of content changed over time.

On both platforms, some users only interacted with Conspiracy content, and some only with Science content. Meanwhile, other users started out interacting with both Conspiracy and Science content—but rapidly switched to interacting only with one or the other.

Regardless of how they started, nearly all of the users became highly polarized. The researchers defined this as happening when more than 95 percent of a user’s comments, shares, and “likes” were for a single category of content. In fact, 93.6 percent of Facebook users and 87.8 percent of YouTube viewers fell into this category.
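The 95-percent threshold described above can be expressed as a simple concentration metric over a user’s interactions. The sketch below is purely illustrative — the function names and data representation are our own, not the authors’ actual analysis code, which used a more formal polarization measure:

```python
from collections import Counter

def polarization(interactions):
    """Fraction of a user's interactions that fall on their dominant category.

    `interactions` is a list of category labels ("science" or "conspiracy"),
    one entry per like, comment, or share.
    """
    counts = Counter(interactions)
    return max(counts.values()) / sum(counts.values())

def is_polarized(interactions, threshold=0.95):
    """A user counts as polarized when more than `threshold` of their
    interactions target a single category."""
    return polarization(interactions) > threshold

# A user with 49 science interactions and 1 conspiracy interaction
# is above the 95-percent cutoff; a 3-to-2 split is not.
print(is_polarized(["science"] * 49 + ["conspiracy"]))       # True
print(is_polarized(["science"] * 3 + ["conspiracy"] * 2))    # False
```

Under this definition, even a user who samples both kinds of content registers as polarized once one category dominates — which is exactly the trajectory the study observed.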

Surprising Speed

However, it was the speed with which users became polarized that Uzzi found most astonishing.

For the users who started out interacting with both Conspiracy and Science videos, “we thought they would continue to look at a variety of content. But it turned out they quickly became very selective,” Uzzi says. How quickly? By roughly the time they made their fiftieth comment, share, or “like.”

“Even people who start out holding two points of view at the same time can very quickly go into an echo chamber,” he says.

That means, he points out, that changes in the media content people consume can affect user beliefs with surprising rapidity. And platforms like Facebook and YouTube are constantly adjusting what users see based on what those users have clicked on in the past. “The time horizons can be very quick,” he says. “Adjusting content in a particular way can lead people to make flash judgments and then hold onto those judgments, even when there is alternative information out there.”

Once in an echo chamber, users tend to stay there. This holds true across platforms for both users polarized to Science and users polarized to Conspiracy. “In terms of information and beliefs, we become old dogs very quickly, and once an old dog, we don’t learn new tricks,” Uzzi says.

Additionally, Uzzi says other research has shown that, once polarized, users tend to become even more polarized. “Inside an echo chamber, the thing that makes people’s thinking evolve is the even more extreme point of view,” he says. “So you become even more left-wing or even more right-wing, because the loudest voice is the thing that drives people in a certain direction.”

Debunking Doesn’t Work

So ideological echo chambers exist, and people tend to enter them quickly and stay there. How might users be persuaded to come out—particularly those in echo chambers that perpetuate false and misleading information?

The answer seems simple: debunk the false information. But polarized social-media users do not seem particularly interested in engaging with content that might contradict their beliefs. And when they do interact with debunking posts, it can backfire, causing them to cling to their positions even more fiercely.

For example, an earlier study by some of the same Italian researchers examined the commenting behavior of more than 9 million Facebook users who were polarized to conspiracy-type thinking and found that only about 100,000 of them had commented on a debunking post. And those who did interact with the debunking content tended to react by upping their activity on conspiracy-related Facebook pages.

The Efforts to Dismantle Echo Chambers

Several attempts to help social-media users escape their echo chambers have launched recently. Those include FlipFeed, a plug-in that allows users to replace their own Twitter feeds with those of random, ideologically different strangers, and Escape Your Bubble, a plug-in that inserts opposing political views into users’ Facebook newsfeeds.

While the jury is out on the efficacy of those tools, it is quaint to recall the days when people expected the Internet to discourage, rather than encourage, echo chambers.

“When the Internet first hit, lots of observers and scientists thought that all this new information—suddenly made just a click away—was going to make everybody smarter,” Uzzi says. “The idea was that since people wouldn’t have to go to a library and dig through old magazines to get information, they would explore the validity of other people’s points of view. They would do fact-checking, they would listen to both sides of the argument, and so they would get to the truth about the facts more quickly.”

But that has remained, as Uzzi calls it, a “big blue-sky hope.”

Featured Faculty

Richard L. Thomas Professor of Leadership and Organizational Change; Co-Director, Northwestern Institute on Complex Systems (NICO); Professor of Industrial Engineering and Management Sciences, McCormick School (Courtesy); Professor of Sociology, Weinberg College (Courtesy)

About the Writer
Anne Ford is a writer in Evanston, Illinois.
About the Research
Bessi, Alessandro, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, Brian Uzzi, and Walter Quattrociocchi. 2016. “Users Polarization on Facebook and YouTube.” PLoS ONE. 11(8): e0159641.
