The Insightful Leader | Sent to subscribers on May 15, 2024
Who takes a risk on new tech?

Okay, this may sound obvious, but bear with me here: companies are made of people. Which means that, when we talk about companies deciding to enter a market or launch a product or integrate AI into their workflow, what we’re really talking about is people deciding to do these things.

This makes the personal incentives of individuals—what researchers often call “career concerns”—relevant to a firm’s overall behavior.

Remember the old saying, “Nobody gets fired for buying IBM”? This week, we’ll discuss how individual appetites for risk can impact how organizations adopt new technology.

“We often have data on how companies act when it comes to technology adoption. We know that company X does this and company Y does something different,” says Kellogg’s Filippo Mezzanotti, an associate professor of finance. “But we don’t know who is making the decision within the institution, and it seems like this individual component might be quite important.”

Turning to Hollywood

Mezzanotti, along with his collaborators Grant Goehring from Boston University and S. Abraham Ravid from Yeshiva University, turned to Hollywood directors to examine how an individual can influence the adoption of new technology—in this case, the transition from film cameras to digital ones. (It was an ideal place to look for data on this topic because the decision about which camera to use is usually made by a single person: the director.)

When the researchers mapped a director’s previous experience to camera specs for all movies that made more than $10,000 at the U.S. box office between 1997 and 2009, they found that directors who were making their first movie had a roughly 10 percent chance of using digital technology. This was twice the probability of directors with one or two prior movies, and three times the probability of directors with even more experience.

“The main result is very simple,” Mezzanotti says. “Adoption is not driven by experienced people but by those who are early in their career.”

(And it really does seem to be prior experience that matters here. A director’s age and technical background, the researchers found, could not explain the decision to go digital.)

Behind this result, the researchers believe, are career-related concerns among early-career directors—namely, how to maximize their odds of making it in a very competitive field.

A special case

These findings are at odds with some existing research on the career concerns of professionals in the finance sector—likely because Hollywood is “a special case,” says Mezzanotti.

Still, the findings speak to a broader story. In any ultracompetitive profession, where the risk of failure is especially high, people who are just embarking on their career will likely be most open to risky new technologies.

The findings also hold implications for how companies, whether in competitive industries or not, ought to think about the adoption and diffusion of technology throughout their organization. If all of the people responsible for making decisions are senior and far along in their careers, their incentives may discourage them from taking necessary risks.

“At the very least, having a mix of different people at different points in their career is a way to balance out these incentives and build toward a richer consideration of new technologies,” says Mezzanotti.

You can read more about this study in Kellogg Insight.

“It diminishes the relationship a company has with its customers.”

Alexander Chernev, in The Wall Street Journal, on the impact of hidden and excessive “junk fees.”