Strategy Economics Jun 2, 2015

Can Wikipedia Be Trusted?

Crowdsourced Wikipedia entries are more biased than Encyclopaedia Britannica articles.

Illustration by Yevgenia Nayberg

Based on the research of Shane Greenstein and Feng Zhu

Crowdsourcing.

It has become a household word, with crowdfunded products, businesses, art projects, nonprofits, and even new journalism models. Citizen scientists band together virtually to find new galaxies, count birds, and monitor water quality.

But the granddaddy of crowdsourced information sites is Wikipedia, which—somewhat ironically—can tell you that the Encyclopaedia Britannica, its predecessor and former go-to source for information, published its last print edition in 2012.

Wikipedia’s popularity as an information source with everyone from grade-schoolers to those in their golden years led Shane Greenstein, a professor of strategy at the Kellogg School, to investigate how faithfully Wikipedia adheres to a “neutral point of view.” Newer articles, he has found, are less biased than ones crafted earlier in the site’s existence. But how does Wikipedia stack up against traditional encyclopedias? Is the wisdom of the crowd more biased than the wisdom of experts?

Our increasing reliance on crowdsourced content makes it important to understand how it differs from the traditional variety, written and edited by experts and professionals. “Not all information is alike,” Greenstein says. “As we’ve moved online extraordinarily rapidly, the primary sources of information have shifted from books to online sources, and there’s this presumption that somehow the content stayed the same.”

Greenstein’s research, conducted with Harvard coauthor Feng Zhu, suggests that Wikipedia entries are slightly more biased politically than their Britannica counterparts, and that this is especially true of longer entries and those with fewer contributors.

Bias and Slant

“As sources that aspire to provide comprehensive information, Britannica and Wikipedia face similar conflicts over the length, tone, and factual basis of controversial, unverifiable, and subjective content,” Greenstein and Zhu write in their study. Understandably, these conflicts are “pervasive” when it comes to current events and other politically charged topics.

But Britannica and Wikipedia address this problem in distinct ways. The encyclopedia uses a small group of experts and editors who engage in a back-and-forth dialogue before settling on what to publish for a given entry. Wikipedia operates at a much larger scale, depending on the virtual crowd—tens of millions of people—to generate entry information. And compared to its traditional counterpart, Wikipedia has a dramatically decentralized process for handling edits and conflicts. Does this allow more bias to creep into its entries?

Greenstein and Zhu compared nearly 4,000 pairs of entries on American politics published in Encyclopaedia Britannica and on Wikipedia. Specifically, the researchers developed an index that measured slant—an indication of whether an article leaned left or right, politically—and bias, the magnitude of that slant regardless of direction.



To do so, he and Zhu utilized a method detailed in previous research by University of Chicago economists. This method capitalizes on the fact that some phrases are more likely to be used by Democrats—“war in Iraq,” “civil rights,” and “trade deficit,” for instance—while other phrases—“economic growth,” “illegal immigration,” and “border security,” for instance—are more likely to be used by Republicans.
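To make the measurement concrete, here is a minimal sketch, in Python, of how such a phrase-based slant index might be computed. The phrase lists are simply the examples quoted above, and the count-based scoring and normalization are assumptions for illustration only; the actual study assigns each phrase an empirically estimated weight rather than treating every phrase equally.

import re

# Illustrative phrase lists taken from the examples in the article; a real
# index uses many more phrases, each with an estimated slant weight.
DEMOCRAT_PHRASES = ["war in iraq", "civil rights", "trade deficit"]
REPUBLICAN_PHRASES = ["economic growth", "illegal immigration", "border security"]

def count_phrase(text, phrase):
    """Count non-overlapping occurrences of a phrase in lowercased text."""
    return len(re.findall(re.escape(phrase), text.lower()))

def slant(text):
    """Signed slant: negative leans Democratic, positive leans Republican.
    Normalizing by the total number of coded phrases keeps long and short
    articles comparable."""
    dem = sum(count_phrase(text, p) for p in DEMOCRAT_PHRASES)
    rep = sum(count_phrase(text, p) for p in REPUBLICAN_PHRASES)
    total = dem + rep
    if total == 0:
        return 0.0  # no coded phrases: treat the article as neutral
    return (rep - dem) / total

def bias(text):
    """Bias is the magnitude of the slant, ignoring its direction."""
    return abs(slant(text))

if __name__ == "__main__":
    article = ("Debate over the war in Iraq and the trade deficit continued, "
               "while supporters pointed to economic growth.")
    print("slant = %+.2f, bias = %.2f" % (slant(article), bias(article)))

Under this toy scoring, an article that uses only Democrat-coded phrases scores -1, one that uses only Republican-coded phrases scores +1, and a balanced mix lands near zero; bias is simply the distance from zero.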

More Eyeballs, Less Bias

Greenstein and Zhu measured both the direction and strength of each article’s opinion. They found that overall, Wikipedia’s political articles are more likely to lean “mildly” Democratic—or to slant left—than Britannica’s, and that the extent of the bias in Wikipedia entries is greater.

But the most interesting finding, at least for Greenstein, was that Wikipedia articles that had undergone more revisions showed less bias and were less likely to lean Democratic. As Greenstein and Zhu write, “the largest biases and slants arise on Wikipedia articles with fewer contributions.”

This is largely consistent with Linus’s Law, a principle applied to software development that claims that “given enough eyeballs, all bugs are shallow.” An argument for making software code openly available, it means that with a sufficient number of users poring over a given piece of code—even a large one—all problems will be found and fixed. But in the case of Wikipedia, it turns out, sometimes there simply are not enough eyeballs. “Extreme slant attracts revision from the opposite extreme,” Greenstein says. “Stuff in the middle doesn’t.”

This may also help explain why Wikipedia articles have a small bias overall, as articles with greater slant attract more readers and, ultimately, revisers, leading to a reduction in bias. Still, Greenstein is interested in “pulling apart Linus’s Law into its micro-components.” It’s something, he says, that “the crowdsource community takes on faith. The social scientists in me and Feng view it as a hypothesis.”

What portion of the Wikipedia articles studied received sufficient revision to come close to Britannica in terms of objectivity? “It’s a small fraction, ten to twenty percent at best,” Greenstein says. “The good news is, those are the popular articles. The bad news is, there’s a lot of text in there that’s still pretty biased.”

Greenstein and Zhu also found that longer articles were more likely to be biased than shorter ones. And Wikipedia’s articles tend to be longer than Britannica’s—partly for the obvious reason that online content faces none of the financial constraints print media does in regard to length. The greater availability of space enables Wikipedia’s writers to add material at will. And that means more opportunity to hear from all of the biased points of view.

Write Local, Revise Global?

It is important to remember that Wikipedia positions itself as a jumping-off point rather than a definitive source. Britannica, on the other hand, presented itself as the ultimate authority. But many people do take Wikipedia as definitive. And that, says Greenstein, is why it is important to tease out its problems, including bias, without detracting from its “marvelous achievement.”

“It’s naïve to think you can leave the text alone,” he says. “It takes a lot of revision, and there’s a lot of text that doesn’t get a lot of attention. You have to spread the attention around.” Thus, the researchers’ findings make clear that managers of crowdsourced content, whether on Wikipedia or elsewhere, need to allocate limited editorial attention as strategically as possible.

Greenstein, meanwhile, is now looking at geographic factors related to Wikipedia. “We were curious after this paper to take a look,” he says. Perhaps not surprisingly, his preliminary findings suggest the sources of revision for entries span the globe. Editing is not done by a single set of editors, or even editors living in one dominant place. “It’s an extraordinarily distributed group,” Greenstein says. Eyeballs, it seems, are everywhere.

Featured Faculty

Shane Greenstein, member of the Strategy Department faculty until 2015

About the Writer
Hillary Rosner is a freelance journalist based in Boulder, Colorado.
About the Research

Greenstein, Shane, and Feng Zhu. “Do Experts or Collective Intelligence Write with More Bias? Evidence from Encyclopædia Britannica and Wikipedia.” Harvard Business School Working Paper No. 15-023. October 10, 2014.

