Innovation · Policy · May 5, 2016

Is Reading Someone’s Emails Like Entering Their Home?

How conceptions of privacy change over time and how analogies pave the way.

Can we keep our Internet behavior private?
Based on the research of Kartikeya Bajpai and Klaus Weber

From data leaks to scandalous videos to the question of who gets to hack an iPhone, privacy issues have captured recent headlines. But what exactly does privacy mean in the digital age?

“There is no single definition of privacy,” says Klaus Weber, an associate professor of strategy at the Kellogg School. “The concept is more subjective and fluid than people might realize. And technology has made it incredibly difficult for regulators to pin it down.”

In recent research, Weber examines how the concept of privacy has evolved in response to rapid technological change—especially in the last forty years, as the rise of digital communications, Internet culture, and personal data has blurred traditional lines between the public and private spheres.

To trace this evolution, Weber and his coauthor, Kartikeya Bajpai, a PhD student at Kellogg, dug through forty years’ worth of archival materials—including policy agendas, newspaper records, and the Congressional database of hearings on the issue. They also conducted interviews with regulators, activists, journalists, corporate lawyers, and academics.

They found that societal views of privacy do not just naturally shift in response to the pressing issues of the day. Instead, they morph in response to a constant battle of moral values—and combatants on all sides use a surprising tool to advance their views: analogies. By describing the latest technology in terms of one that is well established, social actors “normalize” their view of where it fits on the privacy spectrum.

“Defining privacy in a changing society has always been a process of translation,” Weber says. “It’s an attempt to put the new or unfamiliar into familiar terms. Of course, these analogies aren’t value-neutral—there’s always a moral agenda at work, whether we realize it or not.”

The Power (and Limits) of Analogy

“Translating” privacy to account for technological change is not a new challenge.

In the late nineteenth century, when considering laws about intercepting confidential messages, Congress debated whether the telegraph was comparable to the postal service. Protecting the privacy of a telegram, after all, only made sense if everyone agreed that telegrams were analogous to personal letters—a view that, though it never became an official act of Congress, was eventually supported by state laws.

But the rise of electronic communications has made this analogical reasoning even more of a headache. By 1995, courts were debating whether encryption software belonged on a list of regulated munitions (alongside bombs and flamethrowers) or whether it was in fact a “language act” protected by the First Amendment.


Today, technology appears in danger of surpassing analogy altogether. Computerized databanks of the 1960s could easily be compared to paper-filing systems, but what is the appropriate analogy for the Internet, or the cloud?

“Privacy laws often depend on complex legal analogies,” Weber says. “But the analogies become more difficult as our technology becomes more complicated and we get further and further away from the most archetypical case of privacy, the sanctity of one’s home. Lawmakers and judges have to decide: Is a chat room equivalent to a coffee shop, or is it more like a closed debate club?”

Analogy as Moral Argument

Of course, as anyone who has ever been in an argument will know, the analogies we choose are never perfectly objective (though we often present them as such). Lurking beneath each is a worldview and an agenda.

When new technologies throw the definition of privacy into flux, social actors like journalists, rights activists, or Supreme Court justices use analogies to elevate one dominant moral value over another. “If you think privacy should be protected, you need some kind of justification,” Weber says. “On what grounds should you protect it? Without a moral foundation that appeals to higher values or principles, it’s hard to make the case.”

For example, in the late nineteenth century, the legal scholar Louis Brandeis appealed to human dignity in his argument in favor of privacy legislation. He warned against the harmful impact of a technology like the telegraph if—as he wrote—“what is whispered in the closet shall be proclaimed from the house-tops.”

In the 1970s, the dominant moral argument in the United States shifted from concerns about human dignity toward what Weber calls “an ideology of liberty from state intervention.”

Again, changes in technology spurred the transition. The government had begun keeping digital files, and the FBI tactic of wiretapping had been made famous by the Watergate scandal. And once again, analogies played a critical role in influencing how these advances were perceived. If the phone could be considered a domestic object—and journalists, academics, and rights activists argued that it could—then to wiretap was to violate the sanctity of one’s home. This line of reasoning eventually shaped legislation such as the Privacy Act of 1974, which sought to control the collection and use of data by federal agencies.

Protection from government intervention has in fact remained the dominant moral concern in the United States since the late 1960s. But the rise of powerful technology companies is complicating this view.

In 2004, during a congressional debate, one member of Congress used an established analogy—that listening to someone’s phone call was equivalent to entering his or her home—to argue that tech companies should not be able to track their customers’ web activity. He asked his peers how they would feel if a telecom company listened in on everyone’s phone conversations and used speech-recognition technology to identify keywords for targeted telemarketing. Was our Internet use not analogous to our phone conversations? The congressman’s analogy did not win out—which is why Google is allowed to “eavesdrop” on our search history.

For Weber, the evolution of privacy is the result of a dynamic process—one that exposes the moral underpinnings of our laws and institutions. And his research finds that we have now entered a new phase—one with important implications for the meaning of privacy in a digital society. In this current phase, privacy is codified as “data protection” and, in the United States, viewed increasingly as a consumer—as opposed to human—right.

The Market-Based Justification for Privacy

Most of us would agree that our personal data should be protected. Yet we might disagree over who should have the power to protect that data. Is protecting our information better left in the hands of Apple or with the U.S. government? If privacy is a human right, protection must be absolute, except in cases where the government or a court chooses to suspend that right in the public interest—for example, to prevent crime. If privacy is a consumer right, protection is an individual choice: it can be signed away through the consumer’s informed consent.

“In recent years, we have seen a rise in market-based justifications for protecting individuals’ privacy,” Weber says. According to this view of privacy, personal data is no longer viewed through the lens of “human dignity”—it is an asset we can buy and sell in the marketplace as we wish. Privacy protection then is analogous to other consumer rights, such as food labeling or product liability.

In one sense, the market-based justification reflects the efforts of companies like Apple and Google, which—at least in the early days—preferred as little privacy regulation as possible.

“They see privacy not as a right, but as a product feature,” Weber says.

And although tech companies have become more engaged in privacy debates since the Snowden leaks, they have done so in a way that protects their own business interests—that is, they prefer to categorize privacy as a consumer right (not a human right) and to use market-based justifications for why it should be protected. When the U.S. government ordered Apple to comply with the FBI’s request to access an iPhone used by Syed Rizwan Farook in the San Bernardino shooting, Apple appealed on the grounds that creating a customized operating system for the government was too heavy a financial burden.

For Weber, this is all the more reason to study how the concept of privacy is constructed in the digital age. “It’s important to know what we mean by privacy if our personal information is going to power the economy.”

Featured Faculty

Klaus Weber
Thomas G. Ayers Chair in Energy Resource Management; Professor of Management and Organizations

About the Writer
Drew Calvert is a freelance writer based in Iowa City, Iowa.
About the Research

Bajpai, Kartikeya, and Klaus Weber. Forthcoming. “Privacy in Public: Translating the Category of Privacy to the Digital Age.” In Rodolphe Durand, Nina Granqvist, and Anna Tyllström (eds.), From Categories to Categorization (Research in the Sociology of Organizations, Vol. 47). Emerald Group Publishing.