The Insightful Leader | Sent to subscribers on May 6, 2026
Sparring with AI

As a writer, I can’t help but feel a little threatened by AI. Use a model to slick up a slide show or automate budget tracking? Sure, that’s something I’ll willingly outsource to the bots. But when it comes to a creative process like writing, I still prefer to do it the old-fashioned way.

But this week, Kellogg’s Brayden King makes a strong case for how scholars can use AI in their writing process—and it applies to other forms of communication as well. And we share research about another role where AI offers both an existential crisis and an opportunity: apprenticeships.

A sparring partner

By now, the objections to writing with AI are familiar. Because of the data used to train AI, what it writes can often come uncomfortably close to plagiarism. The output of AI models can be unreliable, sometimes outright false. There are ethical arguments and environmental concerns.

But as King, a professor of management and organizations, writes in The Chronicle of Higher Education, those debates obscure how academics might use AI to improve their writing processes. King himself uses Anthropic’s Claude model as a thought partner, feeding it voice memos, incomplete documents, and loose ideas, then asking it to find patterns and provide reflections. 

It’s an interaction that doesn’t negate King’s experience and expertise, but builds on them instead. 

“Claude doesn’t hand me conclusions. It helps me find them based on my own knowledge base,” King writes. “Scholarly writing requires knowing things, like who your audience is, which prior contribution your work departs from, and which potential objections you cannot ignore.”

King compares the current debates about AI in academia to early skepticism about whether junior scholars should write online about work in progress. At its best, blogging served as “a thinking space with an audience,” King writes, where researchers could get instant feedback from knowledgeable peers.

Working through a half-formed idea with AI can provide a similar experience, he observes.

“Where the blog gave you other minds to test your thinking against, AI gives you a sparring partner and collaborator with a seemingly limitless capacity to engage.”

Like the word processor and the internet before it, the technology, King predicts, will change the practice of scholarship and communication—but not the need for human expertise.

“The question is never whether tools change the writing. They always do,” King concludes. “The question for us is how to use the tool to get more out of our effort. Answering that honestly requires knowing your subject and caring about getting it right.”

Read more in The Chronicle of Higher Education.

The end of apprentices?

There’s a common piece of advice about agentic AI: treat it like a team of interns. But if you assign all your busywork to AI agents, what does that leave for the human interns? 

New research coauthored by Luis Rayo, Erwin P. Nemmers Professor of Strategy, models how increasing the use of AI may change the role of apprenticeships in modern work. 

Traditionally, an entry-level apprentice receives training and experience—and a low salary—in exchange for handling menial work. Now that AI can perform those tasks even more cheaply, this centuries-old dynamic could crumble.

“Apprenticeship-like systems are a very common way in which economies solve the difficult problem of transferring human capital from one generation to the next,” Rayo says. “AI risks destroying that system, and the problem then is, who’s going to train the next generation of experts?”  

Indeed, the researchers found that introducing AI raises the floor for what apprentices need to do to justify their expense. But for advanced apprentices, the technology also raises the ceiling of their capabilities, making them more productive and valuable contributors earlier in their career.

“The question is, which one is growing faster—the floor or the ceiling?” Rayo says.    

If the gap between what AI can do and what advanced apprentices can accomplish shrinks enough, not just in a single apprenticeship but across the workforce, society could face serious consequences.

“Human knowledge would start disappearing, and then the robots are just going to do whatever they’re able to do on their own, without the added benefit of advanced human knowledge,” Rayo says. “It’s going to be a case where, instead of accumulating knowledge as a society, we start losing knowledge.” 

Read more in Kellogg Insight.

“Research suggests that young people have greater optimism, creativity, and motivation to drive social change than other age groups.”

Allison Henry, with David Finegold, in Crain’s Chicago Business, on why nonprofit organizations should consider younger candidates for their boards.

© Kellogg School of Management, Northwestern University. All Rights Reserved. Privacy Policy.