Operations | Innovation | Dec 7, 2015

A Gentle Nudge Can Increase Participation in MOOCs

Reminders to collaborate benefit students in massive open online courses.


Based on the research of Dennis J. Zhang, Gad Allon, and Jan A. Van Mieghem

The Internet was supposed to revolutionize and democratize education. In particular, massive open online courses (MOOCs) were going to, as the New York Times wrote in 2012, “bring the best education in the world to the most remote corners of the planet.”

While MOOCs have indeed opened up courses taught by faculty from renowned institutions to students all over the world, a persistent concern is low completion rates: only about five percent of people who register for the courses actually finish them. Most people, it seems, have a hard time engaging with course material in the isolation of their own homes—particularly when they have little financial skin in the game.

“People have started to realize that MOOCs may not do what some of the early visionaries thought,” says Jan Van Mieghem, professor of managerial economics and decision sciences at the Kellogg School. “A key concern of online education is obviously the lack of collaboration among students.”

He wondered whether a collaborative environment could be cultivated in a MOOC and, if so, whether it would help students do better in the course.

So Van Mieghem and fellow Kellogg School researchers—Gad Allon, also a professor of managerial economics and decision sciences, and Dennis J. Zhang, a doctoral student—decided to test out a few techniques in their own MOOC. Specifically, they decided to see what the impact would be of encouraging interactions via the class discussion board and one-on-one digital discussions with fellow students.

“A key concern of online education is obviously the lack of collaboration among students.”—Jan Van Mieghem

They found that these small prods to interact can have a modest impact on students’ engagement and performance.

Measuring Impact

The researchers’ MOOC—a five-week-long course on scaling operations that consisted of four weekly lectures and a weekly quiz—attracted more than 24,000 registrants, some of whom paid to take the class in return for a certificate of accomplishment and others of whom took it for free. About 4,200 students submitted at least one of the quizzes.

In their first experiment, the researchers sent a survey at the beginning of the second week to all course participants, asking for feedback on the first week’s material. Those who returned the survey were then divided into two groups, one of which received an email reminding them to contribute to the course’s discussion board.

The encouragement worked. The simple nudge increased visits to the board by 26.5 percent and increased posts to the board by nearly 97 percent.

And visits to the board were linked to better performance. The researchers found that each additional visit to the board in the first week increased the likelihood that a student would complete the following week’s quiz by about 3.5 percent. And, on average, each student who received the nudge visited the board four additional times a week, meaning that overall, students who received the encouragement to visit the board were, on average, 13 percent more likely to complete the following week’s quiz.
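As a rough back-of-the-envelope check (assuming, as an approximation on our part, that the per-visit effect simply adds up rather than following the paper’s exact model), the aggregate figure follows from the two estimates above:

\[
4 \ \text{additional visits} \times 3.5\% \ \text{per visit} \approx 14\%,
\]

which is in line with the roughly 13 percent reported, the small difference presumably reflecting the unrounded estimates in the underlying paper.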

The visits did not improve quiz scores, however, and after that first week, visits to the discussion board had a rapidly diminishing influence on quiz-completion rates.

Still, Van Mieghem points out that the cost—a single email—was very small. “We did a very minor kind of stimulus,” he says. “We just said to students, ‘Hey, don’t forget to go to the discussion group.’” That there were effects at all, he says, is a “hopeful sign.”

In their second experiment, the researchers invited some participants to take part in online one-on-one discussions about the course material. They found that students who took them up on their offer were 10 percent more likely to complete their weekly quiz—and their quiz scores increased by 2 to 10 percent in subsequent weeks.

The upshot is that the direct communication fostered by one-on-one discussions seems to be more conducive to actually learning the material. Still, only 7 percent of students who received an invitation to take part did so. One barrier is the time and effort it takes to set up—and show up for—a digital discussion. Another is the anonymity inherent in online education: “If people don’t show up, there’s no way to penalize them,” Zhang says. “And they won’t even feel bad, because there’s no shaming factor. If they don’t show up, no one knows who they are.”

Takeaways

The researchers note that their results may not be relevant to all online educators. For one, neither manipulation had a measurable effect on one group of students—those who actually paid to take the course. Paying students, it seems, were highly motivated to take part in the discussion board and one-on-one chats, even without the researchers’ nudges to interact.

But one immediate takeaway is that, among nonpaying students, early social engagement packed the most punch.

“You really have to think hard about the incentives that will help [students] at the beginning of the class,” Zhang says, “which is not the typical approach with discussion boards.” Much current thinking, he says, is focused on ways to highlight and recommend posts that would be relevant to students. But in the early weeks of the course, there are too few posts for that strategy to be helpful.

It is a principle that also holds true for online commerce, according to Zhang. For businesses that use online surveys to request customer feedback, for example, sooner is better than later. “If you’re trying to facilitate interactions with customers, you have to engage them early, or they will lose interest.”

Because the MOOC will be offered on an ongoing basis, the researchers will continue to experiment with ways to encourage social interaction—and hopefully, course completion. They are optimistic that the sheer scale of online education will make it possible for them to refine their techniques.

Many educators running a field experiment “would be impressed if they had 30 students,” says Van Mieghem. “But we’re talking about 10,000 or more students. So the scale, and the amount of data, and the kind of experimentation we can do—it’s just very impressive, and something that cannot be done in a typical, physical channel.”

He expects that, even if online courses are not the revolutionary force some people imagined they would be, they will continue to evolve.

“The reason I got involved is that we have no idea how this is going to work out,” he says. “And my feeling was, instead of letting it come to me, I’d rather be there early and be able to influence it. There’s no doubt that, one way or another, some of this technology will stay.”

Featured Faculty

Member of the Department of Managerial Economics & Decision Sciences from 2005 to 2016

A. C. Buehler Professor; Professor of Operations

About the Writer
Theo Anderson is a writer and editor who lives in Chicago.
About the Research

Zhang, Dennis J., Gad Allon, and Jan A. Van Mieghem. 2015. “Does Social Interaction Improve Service Quality? Field Evidence from Massive Open Online Education.” (September 21). Available at SSRN.

