Designers of consumer surveys must account for several potential sources of bias when they set out to gauge the general public's attitudes, preferences, and beliefs. Among the most common are survey takers' dishonesty about their behavior, misinterpretation of questions, and lack of effort in providing accurate answers.

Two Kellogg School of Management researchers have discovered a new and previously unrecognized type of bias among survey respondents. This bias stems from respondents’ frustration when surveys ignore issues that they regard as critically important. However, designers of surveys can rest relatively easy in light of the new finding. The Kellogg pair has also identified an easy and cost-effective way of overcoming the bias: giving respondents the opportunity to express their concerns.

Venting Strong Opinions
Response substitution, as the newly identified bias is called, is the process whereby some respondents frame their answers to questions in a survey in such a way as to express their views on issues outside the survey’s scope—issues on which they have strong opinions. “People don’t answer the question they’re asked. Instead, they supply an opinion they want to share,” explains Derek Rucker, an associate professor of marketing at the Kellogg School. “The idea that respondents don’t give accurate answers is intuitive,” adds David Gal, an assistant professor of marketing at the Kellogg School. “But the idea that they would volitionally choose to answer questions differently is a very novel insight—a new source of bias that we’re the first to demonstrate.”

Imagine a diner who, after suffering poor service in a restaurant, is asked to complete a survey that focuses only on the quality of the food. Even though she enjoyed the meal, the diner might give it a negative review in order to voice her general disapproval of the service. “She is not responding simply to the question whether the food is good but perceives the question also as an opportunity to express her negative attitude toward the restaurant due to the service,” write Gal and Rucker.

The phenomenon of respondents answering questions they were never asked has obvious implications for interpreters of consumer surveys and for organizations that rely on those interpretations. "[R]esearchers and marketing practitioners might misconstrue the meaning of respondents' answers to their questions if respondents attempt to express attitudes or beliefs that the researchers have not asked about," Gal and Rucker write.

Three Experiments
Having hypothesized response substitution, Gal and Rucker devised ways to test whether it actually occurs and to compensate for it in survey design. The process involved three experiments.

The first experiment asked volunteers to rate the intelligence and wastefulness of two hostesses who treated their friends to dinner. Anne entertained at home, having bought a $250 fondue set that she never used again; Jane spent the same amount at a fondue restaurant. The study participants reacted as previous research had suggested they would, perceiving only Anne as wasteful. They also rated Anne as less intelligent when the survey asked them to rate her intelligence before her wastefulness. But when asked to rate wastefulness before intelligence, they regarded her as more intelligent, because they had already had the chance to voice their opinion of her wastefulness. "This pattern of data is consistent with response substitution because wastefulness led to more negative perceptions of intelligence when participants did not have an opportunity to provide their attitude toward wastefulness," Gal and Rucker write.

The next experiment asked a different set of volunteers to evaluate the quality of a snack bar (based on pictures of the product) manufactured by two companies, one plainly moral and the other immoral. All were asked how important they regarded a company’s moral behavior when deciding whether or not to buy its products. Some subjects were also told that they could add any open-ended thoughts or comments to their answers at the end of the survey. Volunteers who did not have the chance to express open-ended thoughts used the question about product quality to express their attitudes about the company’s morality, Gal and Rucker report. And those who regarded the company’s moral behavior as important in their purchasing decisions were more likely to engage in such response substitution.

The third experiment asked yet another group of volunteers to imagine receiving excellent service and food at a restaurant that had recently been upgraded. They were then asked to complete a survey on the new décor and lighting. Again, some participants were given the chance to add extra comments, while others had no such opportunity. In this case, the two researchers write, “participants who believed there was a high likelihood that their answer about décor and lighting might convey their attitude toward the restaurant provided a more favorable rating when they did not know they would have an opportunity to self-express than when they did.”

Occurrence, Mechanism, and Avoidance
Taken together, the three experiments not only confirm that response substitution occurs, but also provide some understanding of its mechanism. “The first experiment demonstrates the effect,” Rucker explains. “The second experiment shows the significance of a topic is important; when consumers have an important opinion, they want to convey it. And the third experiment shows that you can reduce response substitution by telling people they’ll have an option to state any other opinions they have later.”

That third result, Gal points out, “provides an easy solution for survey designers who want to avoid response substitution: Tell the consumers that, after the survey is over, they’ll have the opportunity to express any opinions they might have. They’ll probably be more satisfied with the survey and give you more valid data.”
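The logic of that remedy can be sketched in a few lines of code. The sketch below is purely illustrative and not from the study: the function, the blending weight, and all numbers are hypothetical, chosen only to show how an off-topic attitude might leak into an on-topic rating when no venting channel is promised.

```python
# Illustrative sketch (hypothetical model and numbers, not from Gal & Rucker):
# a respondent's reported rating blends her true on-topic rating with an
# off-topic attitude, unless she expects a later chance to comment freely.

def reported_rating(true_rating, off_topic_attitude, can_vent,
                    substitution_weight=0.5):
    """Return the rating a respondent reports on a 1-10 scale.

    If the respondent knows an open-ended comment box is coming,
    she answers the question actually asked (no substitution).
    Otherwise her answer partly carries the unasked-about attitude.
    """
    if can_vent:
        return true_rating
    return ((1 - substitution_weight) * true_rating
            + substitution_weight * off_topic_attitude)

# A diner who loved the food (9/10) but suffered terrible service (2/10):
print(reported_rating(9, 2, can_vent=False))  # 5.5 — food score dragged down
print(reported_rating(9, 2, can_vent=True))   # 9 — she answers the question asked
```

Promising the comment box does not change the diner's opinions; it only reroutes the off-topic one, which is why the researchers describe it as a cheap fix.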

Lessons for Survey Users
The research carries other lessons for designers and interpreters of consumer surveys. “When answers look inconsistent or curious, you might want to ask whether the survey takers are trying to express something else,” Rucker warns. “If, say, they dine frequently but aren’t too interested in the food, they might have another reason for their low rating. So if you see inconsistencies or oddities you might look at response substitution as a possibility.”

“The main message is that, when completing surveys for companies or researchers, people may answer in terms of what they want to share,” he continues. “So you not only want to ask the right questions. You want to make sure the consumer is answering the right questions.”

Gal adds an intriguing thought that can stimulate further research on the issue. “One interesting question,” he says, “is that survey takers might start to believe what they say. Saying they don’t like a restaurant’s food even if they do might convince them that they actually don’t like it.”
