AI and the Social Sciences Used to Talk More. Now They’ve Drifted Apart.
Innovation Social Impact Jul 1, 2019

Research shows that the gap between these disciplines is growing, which could make it harder to address social and ethical problems.

Illustration by Michael Meier: AI researchers and scholars of the humanities and social sciences sit at different tables, reflecting the growing gap between their disciplines.

Based on the research of

Morgan R. Frank

Dashun Wang

Manuel Cebrian

Iyad Rahwan

Artificial intelligence researchers are employing machine learning algorithms to aid tasks as diverse as driving cars, diagnosing medical conditions, and screening job candidates. These applications raise a number of complex new social and ethical issues.


So, in light of these developments, how should social scientists think differently about people, the economy, and society? And how should the engineers who write these algorithms handle the social and ethical dilemmas their creations pose?

“These are the kinds of questions you can’t answer with just the technical solutions,” says Dashun Wang, an associate professor of management and organizations at Kellogg. “These are fundamentally interdisciplinary issues.”

Indeed, economists seeking to predict how automation will impact the labor market need to understand which skills machines are best suited to perform. At the same time, the engineers writing software to diagnose tumors may want to know what philosophers have to say about the moral conundrums their technology poses. And coders and psychologists will need to work together to ensure that algorithms in recruiting software do not amplify human biases.

Some researchers have managed to cross departmental barriers. For example, a groundbreaking study last year explored how millions of people across the globe would make the difficult decisions that autonomous vehicles face (e.g., given a choice between killing a pedestrian or a passenger, whose life would they favor?). The researchers aim to use this work to ensure that new technologies reflect universal values.

Yet a new paper by Wang and collaborators finds that the link between AI and the social sciences (and other fields) has weakened over time.

The researchers analyzed several decades of papers published in the field of AI, as well as those in the social sciences, humanities, natural sciences, engineering, and medicine. They found that, more and more, computer scientists are facing social questions on their own, without relying deeply on insights from scholars who study them. At the same time, scholars of the social sciences, physical sciences, and humanities seem to be losing touch with rapid advances in AI as well.

Taken together, the results speak to a renewed need for researchers to collaborate across disciplines, Wang says.

“Just when AI is becoming more and more relevant to any corner of society, it’s becoming more and more isolated,” Wang says. “We really need to close that gap.”

Artificial Intelligence in Society Raises New Issues

People have been grappling with the social and philosophical consequences of technology for centuries, Wang points out. Take, for example, Mary Shelley’s 1818 novel Frankenstein. “AI was born from this kind of fascination,” he says. “It had very deep roots in social sciences.”

More recently, AI researchers have begun to face the real-life quandaries that the technology introduces.

Consider, for instance, when Amazon attempted to develop machine learning tools to score job candidates. Because the software used data on past applicants to predict which people were best-suited for the company, a glaring problem emerged: since many previous applicants were men, the program penalized candidates whose resumes contained the word “women’s” or listed certain all-women’s colleges as an alma mater.

Wherever bias already exists, AI “is just going to magnify that bias,” Wang says. (The Amazon program has since been discontinued.)


Wang wanted to know how often AI researchers were engaging with disciplines such as psychology, philosophy, economics, and political science, which could help them address these inevitable ethical and social issues. One measure of this engagement is whether AI researchers are citing other disciplines in their academic papers. To investigate, Wang collaborated with Morgan Frank, Manuel Cebrian, and Iyad Rahwan of the Massachusetts Institute of Technology.

The team took advantage of a newly available dataset from Microsoft Academic Graph (MAG), which indexes scholarly papers. The data included traditional journal citations as well as conference proceedings, a major venue for AI findings. And it captured citation relationships between papers—that is, whenever one study referenced another.

The AI Clique

Wang and his collaborators examined MAG data from 1950 to 2018. They found that the number of publications in AI and related subfields (such as computer vision and natural language processing) rose exponentially during that time, from hundreds to tens of thousands of papers per year. These fields now dominate computer science research.

To quantify the interactions between AI and other disciplines, the team developed a measure that captured how frequently papers in one field cited another field, controlling for the total number of papers published in the second field.

First, the team looked at how AI papers cited other academic fields. They found that in the 1960s, AI researchers cited psychology papers more than five times as frequently as would be expected if they had instead chosen papers to cite randomly out of a hat. Today, however, they cite psychology papers less than half that often.

Similarly, dramatic drops occurred in citations of philosophy, economics, and art. Not surprisingly, today’s AI papers cite computer science and math the most heavily.

Next, the researchers considered the reverse problem: How often did other disciplines cite AI papers, controlling for the growing number of AI publications each year? Here, they found that fields such as psychology, philosophy, business, political science, sociology, and economics have all become less inclined to draw on AI research. For example, psychologists in the 1960s cited AI papers about four times as much as would be expected by chance. Today, however, they cite AI less frequently than if they were selecting papers to cite entirely at random.

The overall conclusion: “AI has become more and more cliquish,” Wang says.

One possible explanation is that it has simply become harder for social scientists to keep up with rapid advances in increasingly complex AI research.

Furthermore, the swell of interest in AI could, paradoxically, help explain its isolation. Some AI conferences are so in demand that social scientists may have trouble getting in, according to Wang’s coauthor Morgan Frank. In a blog post, Frank noted that one popular meeting “sold out of registration spots in under 15 minutes, thus making attendance difficult for active AI researchers—let alone interested scientists from other fields.”

Who’s Doing AI Research?

Another factor is the shift in who dominates AI research today.

The researchers examined which institutions were publishing the most “central” AI papers—those papers that were most frequently cited by other highly cited papers.

While schools such as MIT, Stanford, and Carnegie Mellon once served as powerhouses for the most central AI research, the researchers found that today these papers were increasingly likely to come out of private companies like Google and Microsoft.

That may be because these firms have the resources to acquire expensive infrastructure. “This is not a cheap sport,” Wang says. “You need to have a whole stack of graphics processing units and computational power and storage.”


That might help explain the growing disconnect between AI and the social sciences. In their study, Wang and collaborators found that researchers in sociology, philosophy, political science, business, and economics are less likely to cite publications produced by companies than those from academia. As such, the concentration of AI research in private industry could be contributing to the weakened relationship with the social sciences.

And once they get a head start, big companies are more likely to continue producing a disproportionate share of the research—a “rich-get-richer phenomenon,” Wang says. Industry teams develop better systems, attract more users, and generate more data, which can then be used to train their systems to become even more accurate. “It’s a self-reinforcing mechanism.”

Bridging the AI Divide

Despite the rapid growth of AI, Wang fears that new technology will fall short of its full potential if it does not better incorporate insights from social science and other fields.

To bridge this gap, Wang recommends that universities encourage more collaborations between AI and other departments. For example, Northwestern University has started a program called CS+X, which connects computer scientists with researchers in fields such as medicine, journalism, law, and economics.

Some existing research hints at how AI developers can effectively integrate findings from other fields. For instance, the study exploring how self-driving cars can better reflect human morality (coauthored by Wang’s collaborator Iyad Rahwan, an AI scholar) drew upon research from psychology, moral philosophy, economics, and even science fiction.

However, the fact remains that such wide-ranging bibliographies are relatively rare.

And just as computer scientists need to consult experts outside of their discipline, Wang says, social scientists can no longer afford to ignore developments in AI. As machines reshape how we work, think, and make decisions, he argues, it is becoming more crucial than ever that economists, philosophers, and psychologists stay abreast of the latest developments in computer science, and vice versa.

“It goes both ways,” Wang says. “AI has to pay more attention to social science. Social scientists have to pay more attention to AI.”

Featured Faculty

Dashun Wang, Professor of Management & Organizations; Professor of Industrial Engineering & Management Sciences (Courtesy)

About the Writer
Roberta Kwok is a freelance science writer based near Seattle.
About the Research
Frank, Morgan R., Dashun Wang, Manuel Cebrian, and Iyad Rahwan. 2019. “The Evolution of Citation Graphs in Artificial Intelligence Research.” Nature Machine Intelligence 1: 79–85.