Exploiting Loneliness: Anthropomorphization and the Questionable Ethics of Personalized Pricing
Lots of big words here today; we'll break them down in just a second ...
Anthropomorphization; generated by Tunu and an AI image generator
In the previous article, we started the conversation on AI and loneliness. We explored how, in a world increasingly characterized by digital interactions, many have turned to artificial intelligence as a balm for the ache of isolation. This week, we’ll take a deeper dive into the artificial sense of intimacy created when interacting with conversational AI and how firms can exploit this to increase profit margins.
"Anthropomorphization" is a word whose length falsely implies an incomprehensible complexity of meaning. The term simply refers to the tendency to attribute human-like traits to non-human entities. It's the behavior you witness in children when they interact with their dolls as if they were real people, or the talking animals found in your beloved Disney animations. Similarly, attributing human traits such as empathy and kindness to conversational AI, creating a sense of connection with it, is a form of anthropomorphism.
Building upon the insights from the previous article, we saw how chatbots like ChatGPT, Google Bard, and Snapchat's My AI allow us to have back-and-forth conversations. Similarly, many retail websites, like American Eagle Outfitters, have developed their own chatbots as an immediate form of customer care: customers can ask the chatbot frequently asked questions about matters like return policies and sizing guides. Chatbots can either be anthropomorphized, where the language they use displays human traits (as with ChatGPT), or non-anthropomorphized, where their language does not display human traits, so no sense of "connection" with the user is established (as with the American Eagle Outfitters chatbot, which lets you follow pre-generated prompts but does not let you type).
An interesting study explored the relationship between anthropomorphized conversational AI chatbots, situational loneliness, and perceived product personalization. Perceived product personalization in this context refers to how a consumer views a product that an AI chatbot has recommended to them based on their unique needs. The study yielded several findings and conclusions; for the purposes of this article, we will focus on situational loneliness.
The study found that the impact of anthropomorphism on perceived product personalization was moderated by situational loneliness. Simply put, individuals experiencing higher levels of situational loneliness were more inclined to perceive a product recommendation as personalized or tailored to their needs when interacting with an anthropomorphic chatbot. Similarly, there was a correlation between anthropomorphism and willingness to pay a higher price for a product, again moderated by situational loneliness. In essence, the higher a subject's level of situational loneliness, the more willing they were to pay a higher price for a product recommended by the anthropomorphized chatbot.
the correlation between level of anthropomorphism and willingness to pay as moderated by level of situational loneliness
While these results were only marginally supported in the study, they bring to the fore a question: "Will private firms take advantage of human loneliness and reliance on AI chatbots to increase product prices through personalized pricing strategies?"
Personalized pricing is when a business charges different consumers different prices based on factors like location and consumer behavior. Conversational AI chatbots create fertile ground for personalized pricing strategies because they provide companies with volumes of personalized data, including not just the explicit information users provide but also their implicit preferences, intentions, and emotional states. Conversations with chatbots can unveil users' needs, desires, and even their reactions to various product prices. Nothing disincentivizes firms from charging higher prices to customers whose conversations signal situational loneliness. And because the personalized pricing system lacks transparency, consumers may find themselves paying more for products or services, a practice that benefits only the company.
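To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how such a pricing strategy could work. Every name, signal, and weight here is an illustrative assumption (no real firm's system is being described): a `loneliness_score` inferred from conversation sentiment and time spent chatting are fed into a markup on a base price.

```python
# Hypothetical sketch of personalized pricing driven by chatbot-derived
# signals. All variable names, signals, and weights are illustrative
# assumptions, not a description of any real system.

def personalized_price(base_price, loneliness_score, engagement_minutes):
    """Return a price adjusted per user.

    loneliness_score: 0.0-1.0, (assumed) inferred from conversation sentiment
    engagement_minutes: time spent chatting with the anthropomorphized bot
    """
    # A firm facing no disincentive could mark up prices for users whose
    # conversations signal higher situational loneliness.
    markup = 1.0 + 0.2 * loneliness_score          # up to +20% for loneliness
    markup += 0.01 * min(engagement_minutes, 10)   # up to +10% for engagement
    return round(base_price * markup, 2)

# Two users see different prices for the same product:
print(personalized_price(50.0, loneliness_score=0.0, engagement_minutes=0))   # 50.0
print(personalized_price(50.0, loneliness_score=0.9, engagement_minutes=15))  # 64.0
```

The point of the sketch is the opacity: neither user is told that the other paid a different price, or which conversational signals drove the markup.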
While personalized pricing is legal under EU and US law, its ethics can be contested, especially in light of conversational AI and anthropomorphism. EU law specifies its legality "as long as traders do not use personal characteristics in breach of anti-discrimination laws". Some of these characteristics are sex, age, color, and language, to name a few. Although emotional state is difficult to quantify and characterize, it should be considered a potentially discriminatory factor when exploring the ethics of personalized pricing. This is especially necessary given the prevalence, and prospective increase, of loneliness in today's society.
As conversational AI becomes an integral part of our everyday lives, it is imperative to ensure that the use of personalized data and pricing strategies aligns with ethical principles and consumer protection. Transparency and responsible practices are essential in this digital age.