The U.S. has grappled with a shortage of coronavirus tests since the pandemic began, preventing many people from knowing with certainty whether they’ve contracted COVID-19.
This testing shortage is likely to have dire consequences for public health. But it could also deepen the toll on businesses at a time when political officials and business leaders are working to stem the virus’s economic impact.
Recent research from Andrew Dillon, clinical associate professor of development economics within Kellogg’s Public–Private Interface Initiative (KPPI), suggests that testing employees for an infectious disease can have a surprising impact on their productivity.
Dillon, along with coauthors Ashesh Prasann and Jed Friedman of the World Bank, Pieter Serneels at the University of East Anglia, and Oladele Akogun at the Modibbo Adama University of Technology, explored how medical testing and treatment affect job performance when people are particularly concerned about contracting an infectious disease.
They focused on malaria. The team launched a malaria testing and treatment program for agricultural workers in Nigeria, and then monitored these employees’ earnings, productivity, and physical activity in the weeks after people were informed of their test results and received necessary treatment.
The researchers found that access to malaria testing and treatment led to significant increases in both earnings and days worked, but not solely for workers who tested positive. Interestingly, earnings and productivity also jumped for employees who tested negative for malaria. After learning that they were disease-free, these employees became more physically active on the job and devoted more of their work hours to higher-effort—and higher-paying—tasks.
The finding offers an important lesson for any business facing the threat of infectious disease, whether malaria or COVID-19. “When people don’t know their status, they can’t take the appropriate curative or protective measures for themselves,” Dillon says. “That potentially has productivity implications for firms, workers, and the global economy.”
Dillon and colleagues undertook a series of experiments at a large Nigerian sugarcane farm to observe the small-scale economic impacts of readily accessible malaria testing.
The economic benefits of treating deadly diseases were clear: prior research had found, for example, that countries like Greece, Spain, Italy, and Jamaica enjoyed swift economic growth after eradicating malaria. Another study concluded that in economies where malaria was widespread, income in 1995 was only 33 percent of that in otherwise similar countries where malaria was absent.
But the researchers suspected that simply testing for a disease might also deliver an economic boost, not just by helping those who were sick to recover, but also by improving productivity for those who tested negative.
Malaria, which is transmitted through mosquito bites, is widespread in Nigeria. The World Health Organization estimates that 100 percent of the country’s population faces significant risk of infection, and the disease kills roughly 100,000 Nigerians per year (a full 25 percent of the disease’s global death toll).
Early symptoms of malaria resemble an intense flu, with fever, chills, headaches, and nausea being the most common. However, it can take over a week for symptoms to appear, and people who are frequently exposed to the disease sometimes acquire “partial immunity,” which can lessen the severity of symptoms. As such, people may keep working even after being infected. And, unlike many other infectious diseases, malaria cannot be transmitted directly from person to person, meaning that employees who choose to keep working are not putting others at risk.
Roughly 800 sugarcane cutters (all of them men) worked on the farm at the time of the studies. Each morning, employees chose one of two possible tasks for the day: cutting sugarcane or “scrabbling”—that is, collecting cut sugarcane rods and readying them for processing by rolling them into bundles.
Employees who cut sugarcane were paid per sliced rod, with the average worker earning about 1,008 Naira in a day. Scrabbling was the less physically demanding of the two options, and it also paid less, offering a fixed rate of 500 Naira per day, about half of what sugarcane cutters earned.
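To make the earnings tradeoff concrete, here is a minimal sketch of the daily task choice in Python. Only the roughly 1,008-Naira average for cutting and the 500-Naira flat rate for scrabbling come from the study; the per-rod piece rate and the number of rods cut below are hypothetical values chosen purely for illustration.

```python
# Minimal sketch of the daily task choice on the farm.
# The 500-Naira scrabbling rate and ~1,008-Naira average cutting
# earnings come from the article; the per-rod piece rate and the
# rods-per-day figures are hypothetical illustrations.

SCRABBLING_FLAT_RATE = 500   # Naira per day, fixed
PIECE_RATE_PER_ROD = 8       # Naira per cut rod (hypothetical)

def daily_earnings(task: str, rods_cut: int = 0) -> int:
    """Return a worker's pay for one day under the two-task scheme."""
    if task == "scrabbling":
        return SCRABBLING_FLAT_RATE
    if task == "cutting":
        return PIECE_RATE_PER_ROD * rods_cut
    raise ValueError(f"unknown task: {task}")

# A worker feeling healthy enough to cut ~126 rods roughly doubles
# the flat scrabbling wage (8 * 126 = 1,008 Naira, the reported average).
print(daily_earnings("cutting", rods_cut=126))  # 1008
print(daily_earnings("scrabbling"))             # 500
```

The piece-rate structure is what makes task choice a useful productivity signal: only workers who feel well enough to exert more effort stand to earn more by cutting.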
All employees on the farm had access to the malaria testing and treatment program. Over the course of six weeks, researchers invited workers to have a sample of their blood drawn and analyzed. Workers received their results about three days later. The 36 percent who tested positive for malaria were given Artemisinin-based Combination Therapy (ACT), which kills the disease-causing parasite within seven days (though it can take several weeks for the previously infected to fully regain their strength).
The researchers offered the six-week treatment program twice over two subsequent harvest seasons, using two slightly different approaches to answer their central question.
In the first season, they drew from records of how many days each employee worked, which task they chose, their total earnings, and how much work they were able to do per day. However, the researchers realized that such detailed records are not always available in other workplaces (for instance, when work is done collaboratively among several people). Without such data, it would be difficult for other researchers to test whether the results generalize to other contexts.
So they decided to test whether their results held up when they used an alternative productivity measure: physical activity. In the second harvest season, they ran a second study at the same farm, this time giving a random subset of 83 employees FitBits to wear.
The researchers found that testing for malaria and treating those who were infected benefited all employees, boosting overall earnings and days worked by about 10 percent. The reason was twofold.
First, employees who tested positive and received treatment boosted their earnings by simply feeling healthier and coming to work more often. Among this group, the number of days worked per week after treatment increased by 7 percent.
Second, those who tested malaria-negative also had higher productivity; after learning that they were healthy, these employees took on more strenuous, higher-paid work and generally upped their physical activity on the job.
In short, the results show a direct link between a negative malaria diagnosis and how much physical effort a worker puts forth. “That’s something we could only establish by measuring workers’ physical activity,” Dillon says.
The researchers’ analysis also established that, beyond the public-health benefits, the financial benefits of the medical program outweighed the costs. During the three weeks post-testing (or post-treatment, for those who had tested positive), the program increased earnings by more than $13 per employee, while the cost of implementing the program was just $10 per employee.
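As a back-of-the-envelope check, the sketch below reproduces that comparison using the article's per-employee figures. The workforce size of roughly 800 cutters comes from the study description; treating every worker as participating, and the farm-wide total that follows, are assumptions made purely for illustration.

```python
# Back-of-the-envelope cost-benefit check using the article's figures:
# earnings rose by more than $13 per employee in the three weeks after
# testing, while the program cost about $10 per employee to run.

EARNINGS_GAIN_PER_WORKER = 13.0  # USD, lower bound reported in the study
PROGRAM_COST_PER_WORKER = 10.0   # USD per employee
WORKFORCE = 800                  # cutters on the farm; assumes full participation

net_per_worker = EARNINGS_GAIN_PER_WORKER - PROGRAM_COST_PER_WORKER
simple_return = net_per_worker / PROGRAM_COST_PER_WORKER

print(f"Net gain per worker: ${net_per_worker:.2f}")               # $3.00
print(f"Simple return on program cost: {simple_return:.0%}")       # 30%
print(f"Illustrative farm-wide net gain: ${net_per_worker * WORKFORCE:,.0f}")  # $2,400
```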
Less easily quantifiable, Dillon adds, is the worker goodwill that redounds to a company when it provides access to an important health intervention—not to mention the many other upsides that accrue for individuals, families, and communities when hundreds of people are treated for malaria.
The experiments also didn’t capture the long-term benefits to the farm. But Dillon says the results suggest that implementing testing and treatment could help a company reduce its expenses. Because sick employees could recover, for example, the farm would likely have to spend less on training replacements going forward. Plus, they might be able to cut back on worker supervision, since a clean bill of health alone motivated people to work harder.
One critical takeaway from the study: business leaders overseeing employees in contexts where an infectious disease is rampant—nearly every part of the world today—stand to benefit from ensuring that their employees are clear on their own health status.
“When workers are in endemic situations or face high risks as part of their jobs,” Dillon says, “resolving informational uncertainty can help them become more productive and devote more of their effort to work.”
Given the manifold gains that flow from testing and treatment programs—for individuals themselves, for businesses, for society more broadly—who should pay for them? Dillon makes the case that, in the absence of robust public-health programs, companies should consider the costs a sound investment.
“Certainly the public-health system should be the first stop for workers in terms of managing their health,” Dillon says. “But for very physical workers especially, we know that there’s this link between being healthy and being more productive on the job. It’s certainly within a firm’s interest to make these types of investments because it can reduce supervision and turnover in their labor forces.”
At the same time, employees also have plenty to gain from the implementation of these programs—probably more than the firms themselves. Does that mean they should be on the hook for covering the cost of tests? Dillon believes the best answer may be a mix of both, perhaps resembling typical employer-sponsored health-insurance programs in the U.S., where employers and employees jointly contribute to the total cost of employees’ healthcare.
“I think the issue comes down to one of coordination,” he says. “Can firms providing workplace-based health services reduce the costs and better nudge workers to have health information and access to treatment? And can those costs be shared between firms and workers?”