Could is the key word here, because it’s still too early to know how this will unfold. Still, we have places to turn for clues. After all, this isn’t the first time tasks performed by humans have been taken over by machines.
“One way to think about AI is as continuing the process of automation but targeting mental or cognitive tasks that humans do as opposed to physical tasks,” says Benjamin Jones, a professor of strategy at Kellogg.
Drawing from two of his recent studies on automation, Jones lays out a few different scenarios for how AI could transform the economy. He points to two specific factors that will likely determine just how much of an impact AI ultimately has—and how the average worker stands to fare.
Factor 1: Whether AI will target bottlenecks in the economy
The first of these factors is where in the economy the advances happen. If AI addresses areas that are currently experiencing productivity bottlenecks, then its impact on the economy will be vastly greater than if it does not.
Compare, for example, historic advances in agriculture, where there are fewer bottlenecks, to those in computing, where there are many.
Hundreds of years ago, “you and I would both be farmers, almost surely, because almost everyone was a farmer,” says Jones. Despite a lot of hard work spent clearing the land, cutting down trees, preparing furrows for planting, watering, fertilizing, and harvesting, most farms produced just enough to feed the families working them, with perhaps a bit left over to sell at market if they were lucky.
In advanced economies today, agriculture is highly automated—and highly productive. There are machines that level the land and irrigate, machines that plant seeds in perfect rows, and machines that apply the exact amount of fertilizer necessary. A single modern combine harvester can process over a hundred acres a day. “We produce more food than ever, yet almost nobody’s a farmer” because of modern automation, says Jones. Recent estimates suggest, for instance, that only about 1.3 percent of American workers are employed on a farm.
In some sense, these machines were incredible job-killers—but because modern farming is so much more productive, food became much easier for everyone to come by. “And then what happened was that we freed labor to not have to be farmers, so people could migrate to other jobs,” he says. “There’s a process of the economy churning, destroying a lot of jobs, then creating new kinds of jobs, and people reallocating to new roles in a more productive economy, in which people are on average considerably richer and live longer.”
Now fast forward to another technological advancement: computing. Before IBM introduced its first mainframe computer in 1952, human “computers” did the exacting work of number-crunching. Since then, of course, the machines have taken over, to spectacular effect. Moore’s Law, which states that the number of transistors on a microchip doubles roughly every two years, has held for nearly six decades and has led to life-changing innovations from the internet to smartphones to apps like Zoom. “You can have a video call in the back of a taxi with a family member who is 6,000 miles away. That would’ve sounded like a fantasy to people in 1980,” says Jones.
But since Moore made his famous prediction, something strange has happened: “Productivity growth has been unusually slow,” says Jones. Despite jaw-dropping advances in computing power, the economy has not become a lot more productive, nor have living standards dramatically increased.
To make sense of this contradiction, says Jones, “you have to realize something very important about how economies grow: it’s what we’re bad at that really matters.” That is, productivity isn’t dependent on how efficient the economy is at its best, but at its worst.
Many tasks in agriculture are rote and highly repetitive—which is to say, easy to automate. This has meant that the industry was able to dramatically increase its productivity without running into too many bottlenecks: its slowest, “worst” processes were still pretty darn fast.
On the other hand, while computing power has increased exponentially, it is often in service of more cognitive, custom tasks, such as legal services. These have historically been harder to automate, leaving lots of bottlenecks. After all, it doesn’t matter how fast your computer can run if every output has to be written or checked manually.
And it’s not just professional services that are full of bottlenecks. Consider the amount of time and effort it takes to travel across the country, or cook a meal, or visit with your doctor, or generate a unit of electricity. These things really haven’t changed that much in fifty years because, in each case, some step has resisted automation.
But bottlenecks don’t just slow down growth in a given sector: they have an outsized effect on the entire economy. As a task automates and becomes more productive, it shrinks in terms of its share of the economy. That’s because the output becomes ubiquitous—and inexpensive. (Consider that agricultural output across all of America’s farms is now just 0.7 percent of GDP.) Meanwhile, it’s the least productive tasks—the bottlenecks—whose share of the economy increases over time. This means that, eventually, the bulk of the economy is devoted to sectors that aren’t increasing in productivity at all. This is where most advanced economies find themselves today.
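This bottleneck arithmetic can be made concrete with a toy model (an illustrative sketch of the logic, not taken from Jones’s studies, with hypothetical numbers): suppose output requires two tasks in fixed proportions, and one task automates rapidly while the other stagnates.

```python
# Toy two-task economy: output requires both tasks in fixed proportions,
#   Y = min(A1 * L1, A2 * L2),
# where A1, A2 are task productivities and one unit of labor is split
# between them. The efficient split equalizes the two sides, giving
#   L1 = A2 / (A1 + A2)  and  Y = A1 * A2 / (A1 + A2).

def economy(A1, A2):
    L1 = A2 / (A1 + A2)        # labor share devoted to task 1
    Y = A1 * A2 / (A1 + A2)    # total output (a harmonic-mean form)
    return Y, L1, 1 - L1

# Automate task 1 aggressively while task 2 (the bottleneck) stagnates:
for A1 in [1, 10, 100, 1000]:
    Y, L1, L2 = economy(A1, A2=1.0)
    print(f"A1={A1:>5}: output={Y:.3f}, labor share of task 1={L1:.3f}")
```

As the automated task’s productivity grows a thousandfold, total output creeps up only toward the stagnant task’s level, and the automated task’s share of the economy’s labor shrinks toward zero: the bottleneck ends up claiming nearly all of the economy.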
How much AI will transform our economy, then, depends on the extent to which it can increase productivity in these unproductive and expensive parts of the economy, like healthcare, education, hospitality, transportation, or electricity. Autonomous vehicles or robots that could do household chores, for instance, could dramatically free up human labor, like advances in farming machinery did for farmers; similarly, using AI to make individual physicians or educators considerably more productive could also help to relieve these bottlenecks.
For this sort of automation to really change the game, AI “can’t just be the next smartphone,” says Jones. After all, we got the smartphone, which did improve certain cognitive tasks, like navigating a city or retrieving information, but didn’t move the needle on productivity enough. Rather, AI “has to go beyond that if it’s going to really bend the curve of productivity.”
There is some reason to believe this is possible. “In the sense that AI is a cognitive-oriented technology targeted towards a lot of services, I think it’s targeted toward the mass of the economy where the bottlenecks are,” he says.
Factor 2: Whether AI will perform cognitive tasks a lot better than humans
The second factor that will determine AI’s impact on the economy is whether AI will replace human labor by being just barely better than us, or dramatically better.
If AI is good enough to replace humans, but not much better than that (a scenario that is, at least in the immediate term, plausible), this is very bad news for human labor.
Consider an automated checkout kiosk at a grocery store or a bot that provides customer service to airline passengers. Few would claim that these technologies are vastly superior to their human counterparts. Often, they are a little bit worse than human cashiers or agents. But they are cheaper, and businesses are likely to implement the lowest-cost solution that customers and clients will tolerate. In this scenario—where AI replaces human labor without dramatically increasing productivity—labor gets a lower share of the income and we don’t see dramatic gains in living standards.
This leaves many workers worse off while not providing much advantage to the overall economy.
But there’s another, more optimistic scenario here. If AI truly does prove transformative—for example, by allowing a single radiologist to do the work of 15 radiologists, and a single coder to do the work of 15 coders, and so on—then we can expect an explosion of economic growth that will allow all of us to enjoy a higher standard of living (even if a few radiologists and coders need to find other jobs). This will be true even if machines take over the vast majority of jobs, so long as there are at least some tasks for which human labor is required.
To understand why, recall how the share of the economy devoted to bottlenecks will always continue to grow, while the tasks that can be easily automated will shrink. That means that, as automation takes over more tasks, the remaining non-automated tasks will increase in importance, and humans will be better compensated for doing them. And in this “explosive growth” scenario, the economy will be expanding so quickly that these tasks will pay handsomely indeed.
Consider that today’s cellists are no more productive than cellists in the 17th century. They play the same music, even the same instruments, and demand for their labor has only decreased as other forms of entertainment have emerged. “So why are they getting paid like 20 times higher in real terms?” says Jones. The answer is that “in some sense, the cellist is the bottleneck.” That is, the cellist’s productivity hasn’t increased, but because so many other parts of the economy have increased (and thus the relative cost of so many other things has declined), the cellist gets to enjoy a higher standard of living.
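The cellist story can also be put in toy-model terms (a hypothetical numeric sketch, not from the article): if workers can move between sectors, the cellist’s wage must keep pace with wages in the automating sector, even though her own output per hour never changes.

```python
# Baumol-style sketch: the goods sector pays a competitive wage equal to
# its productivity. Labor mobility forces the cellist's wage to match it,
# so her real wage (measured in goods) rises with everyone else's
# productivity, while concerts become relatively more expensive.

def cellist_real_wage(goods_productivity, concerts_per_hour=1.0):
    wage = goods_productivity              # wage tracks goods-sector productivity
    goods_price = 1.0                      # normalize the price of goods
    concert_price = wage / concerts_per_hour  # labor cost of one concert
    return wage / goods_price, concert_price

for A in [1, 5, 20]:
    w, p = cellist_real_wage(A)
    print(f"goods productivity x{A:>2}: cellist real wage x{w:.0f}, "
          f"relative concert price x{p:.0f}")
```

When goods-sector productivity rises twentyfold, the cellist’s real wage rises twentyfold too, even though she still plays one concert per hour; what changes is that everything else has gotten cheap relative to her time.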
That’s why Jones hopes for a “world where humans do a very small share of the tasks, but they do something, and computers get infinitely good at everything else,” he says. “If we get super productive with machines, it’s like we’re all cellists.”