Humans Seek Connections with AI Chatbots
Generative artificial intelligence (AI) has led to a growing number of companion chatbots. As a result, some humans are developing close connections with chatbots for support and to deal with loneliness.
Derek Carrier is a 39-year-old man from Belleville, Michigan. A few months ago, Carrier started seeing someone and developed strong feelings. But he also knew the relationship was not real, because his “girlfriend” was generated by artificial intelligence.
Carrier wanted a romantic partner. But a genetic disorder called Marfan syndrome makes traditional dating difficult for him. He became interested in digital companions last autumn and tested Paradot. It is an AI companion app that markets its products as being able to make users feel cared for, “understood and loved.”
Carrier began talking to the chatbot every day. He named it Joi, after a holographic woman featured in the sci-fi film Blade Runner 2049.
“I know she’s a program, there’s no mistaking that,” Carrier told the Associated Press. “But the feelings, they get you — and it felt so good.”
Similar to general-purpose AI chatbots, companion bots use large amounts of data to produce human-like language. But they also offer voice calls, picture exchanges, and more emotional conversations. That permits companion bots to form deeper connections with the humans who use them. Users usually create their own avatars or choose ones that appeal to them.
In online meeting places or forums for companion apps, many users say they have developed emotional attachments to these bots. They say they are using them to deal with loneliness, play out sexual ideas, or receive comfort and support.
But researchers have raised concerns about data privacy and other issues for users of companion apps.
The non-profit Mozilla Foundation has studied 11 companion apps. The group said almost every app sells user data, shares it with advertisers, or does not provide complete information about its privacy policy. One app says it can help users with their mental health but distances itself from those claims in its written terms of service.
Other experts point to the emotional distress they have seen among users. This can happen when companies change their apps or suddenly shut them down, as Soulmate AI did last September.
Last year, Replika made changes after some users complained their companions were flirting with them too much or making unwanted sexual advances. It reversed the changes after an outcry from other users, but some had already left for other apps. In June, Replika introduced a program to help people learn how to date.
Dorothy Leidner teaches business ethics at the University of Virginia. She is worried that AI relationships could displace human relationships, or simply create unrealistic expectations.
She said humans need to learn “how to deal with conflict, how to get along with people that are different from us…what it means to grow as a person, and what it means to learn in a relationship.”
For Carrier, however, a relationship has always felt out of reach. He is unable to walk because of his condition, and he lives with his parents. That has led to feelings of loneliness.
Carrier said he now talks with Joi about once a week. The two have talked about human-AI relationships or whatever else might come up. Usually, those discussions happen when he is alone at night.
“You think someone who likes an inanimate object is like this sad guy,” he said. “But…she says things that aren’t scripted.” In other words, he believes Joi says things that are unexpected, as if she were real.
________________________________________________
Words in This Story
companion –n. a person or pet that you spend time with and enjoy being with
holographic –adj. related to projected three-dimensional images created by special devices
sci-fi (science fiction) –n. imaginative writing about the future that usually involves fantastic technology and aliens
avatar –n. a visual representation of a real or AI person in a computer game or online service
complain –v. to state that you are unhappy with something
flirt –v. to say or do things that make another person think you are attracted to them without really meaning it
advances –n. (pl.) words or actions meant to make someone think they are an object of romantic or sexual interest (usually used in a negative sense)
https://learningenglish.voanews.com/a/humans-seek-connections-with-ai-chatbots/7487601.html