Why would shoppers prefer chatbots to humans? New study pinpoints a key factor


Technological advancements are revolutionizing customer service interactions, with firms increasingly relying on chatbots—automated virtual agents that can simulate human conversation. While people generally prefer interacting with human customer service agents, a new study reveals an interesting twist: when consumers feel embarrassed about their purchases, they actually prefer dealing with chatbots. The study was published in the Journal of Consumer Psychology.

The primary aim of the study was to understand how consumers’ concerns about self-presentation—essentially, their worries about being judged by others—affect their interactions with chatbots compared to human customer service agents. Lead researcher Jianna Jin, an assistant professor at the University of Notre Dame, and her colleagues wanted to explore whether chatbots could mitigate feelings of embarrassment in online shopping scenarios. This inquiry was particularly relevant as chatbots, with their ambiguous or disclosed identities, become more prevalent in the digital marketplace.

The researchers conducted a series of five studies to understand consumer preferences when dealing with chatbots versus human agents in contexts likely to elicit embarrassment. Participants were recruited from Amazon Mechanical Turk and other platforms.

Study 1 involved 403 participants who were asked to imagine buying a personal lubricant from an online store. They interacted with an ambiguous chat agent, meaning the agent’s identity as either human or chatbot was not disclosed. The participants then had to infer the agent’s identity and complete a measure of self-presentation concerns related to sex-related topics.

The results showed that participants with higher self-presentation concerns were more likely to infer that the ambiguous chat agent was human. This finding suggested that in situations where people felt anxious about how they were perceived, they tended to err on the side of caution, assuming the agent might be human to prepare themselves for potential embarrassment.

Study 2 expanded on these findings by comparing reactions to different product categories. Here, 795 female participants imagined purchasing either a personal lubricant or body lotion from an online store and interacted with the same ambiguous chat agent as in Study 1. The study aimed to see if the type of product influenced their perception of the chat agent’s identity.

As predicted, participants inferred the agent to be human more frequently when shopping for personal lubricant compared to body lotion. This demonstrated that the nature of the product could activate self-presentation concerns, affecting how consumers perceive and interact with customer service agents.

Study 3 shifted the focus to clearly identified chatbots and human agents. A large sample of 1,501 participants was asked to imagine buying antidiarrheal medication and interacted with either a non-anthropomorphized chatbot (a chatbot without human-like features), an anthropomorphized chatbot (a chatbot with human-like features), or a human service rep.

Participants showed a higher willingness to engage with the non-anthropomorphized chatbot than with the human agent, particularly when the purchase context involved potential embarrassment. However, this preference diminished when the chatbot was anthropomorphized, indicating that giving chatbots human-like qualities can make consumers feel judged in much the same way they would by a human agent.

Study 4 delved deeper into how self-presentation concerns influenced perceptions of a clearly identified anthropomorphized chatbot versus a human agent. Participants were asked to imagine purchasing a personal lubricant and rated the chatbot or human agent on perceived experience (the capacity to feel emotions and have consciousness).

Those with higher self-presentation concerns ascribed more experience to the anthropomorphized chatbot, despite knowing it was not human. This finding suggested that anthropomorphism introduces ambiguity about a chatbot’s human-like qualities, affecting consumer comfort levels.

Studies 5a and 5b involved real interactions with chatbots. In Study 5a, 386 undergraduate students were asked to choose between two online stores, one with a human service agent and one with a chatbot, for purchasing either antidiarrheal or hay fever medication. Participants preferred the chatbot store for the embarrassing product (antidiarrheal medication) and the human store for the non-embarrassing product (hay fever medication). This choice was mediated by feelings of embarrassment, as indicated by participants’ spontaneous explanations.

Study 5b involved 595 participants interacting with a real chatbot about skincare concerns. Participants were more willing to provide their email addresses to the chatbot than to a human agent, a behavior mediated by reduced feelings of embarrassment when interacting with the chatbot.

“In general, research shows people would rather interact with a human customer service agent than a chatbot,” said Jin, who led the study as a doctoral student at Ohio State’s Fisher College of Business. “But we found that when people are worried about others judging them, that tendency reverses and they would rather interact with a chatbot because they feel less embarrassed dealing with a chatbot than a human.”

While the study offers significant insights, it has some limitations. The reliance on self-reported measures and hypothetical scenarios in some of the studies may not fully capture real-world behaviors. Additionally, the focus was mainly on specific embarrassing product categories, which may not generalize to all types of products or services.

Nevertheless, the findings have practical implications. Companies should consider them when designing customer service strategies, especially for products that might cause consumers to feel self-conscious. By clearly identifying chatbots and avoiding excessive anthropomorphism, businesses can improve customer comfort and engagement.

“Chatbots are becoming more and more common as customer service agents, and companies are not required in most states to disclose if they use them,” said co-author Rebecca Walker Reczek, a professor at Ohio State’s Fisher College. “But it may be important for companies to let consumers know if they’re dealing with a chatbot.”

The study, “Avoiding embarrassment online: Response to and inferences about chatbots when purchases activate self-presentation concerns,” was authored by Jianna Jin, Jesse Walker, and Rebecca Walker Reczek.

© PsyPost