{"id":506,"date":"2024-08-18T12:00:00","date_gmt":"2024-08-18T12:00:00","guid":{"rendered":"http:\/\/washnow.me\/?p=506"},"modified":"2024-08-23T14:34:41","modified_gmt":"2024-08-23T14:34:41","slug":"people-are-falling-in-love-with-and-getting-addicted-to-ai-voices","status":"publish","type":"post","link":"http:\/\/washnow.me\/index.php\/2024\/08\/18\/people-are-falling-in-love-with-and-getting-addicted-to-ai-voices\/","title":{"rendered":"People are falling in love with \u2014 and getting addicted to \u2014 AI voices"},"content":{"rendered":"
\n


\u201cThis is our last day together.\u201d\u00a0<\/p>\n

It\u2019s something you might say to a lover as a whirlwind romance comes to an end. But could you ever imagine saying it to\u2026 software?\u00a0<\/p>\n

Well, somebody did. When OpenAI tested out GPT-4o, its latest generation chatbot that speaks aloud<\/a> in its own voice, the company observed users forming an emotional relationship with the AI \u2014 one they seemed sad to relinquish.\u00a0<\/p>\n

In fact, OpenAI thinks there\u2019s a risk of people developing what it called an \u201cemotional reliance\u201d on this AI model, as the company acknowledged in a recent report<\/a>.\u00a0<\/p>\n

\u201cThe ability to complete tasks for the user, while also storing and \u2018remembering\u2019 key details and using those in the conversation,\u201d OpenAI notes, \u201ccreates both a compelling product experience and the potential for over-reliance and dependence.\u201d\u00a0<\/p>\n

That sounds uncomfortably like addiction. And OpenAI\u2019s chief technology officer Mira Murati straight-up said<\/a> that in designing chatbots equipped with a voice mode, there is \u201cthe possibility that we design them in the wrong way and they become extremely addictive and we sort of become enslaved to them.\u201d<\/p>\n

What\u2019s more, OpenAI says that the AI\u2019s ability to have a naturalistic conversation with the user may heighten the risk of anthropomorphization \u2014\u00a0attributing humanlike traits to a nonhuman \u2014 which could lead people to form a social relationship with the AI. And that in turn could end up \u201creducing their need for human interaction,\u201d the report says.\u00a0<\/p>\n

Nevertheless, the company has already released the model, complete with voice mode, to some paid users, and it\u2019s expected to release it<\/a> to everyone this fall.\u00a0<\/p>\n

OpenAI isn\u2019t the only one creating sophisticated AI companions. There\u2019s Character AI, which young people report becoming so addicted to<\/a> that they can\u2019t do their schoolwork. There\u2019s the recently introduced Google Gemini Live, which charmed Wall Street Journal columnist Joanna Stern so much that she wrote<\/a>, \u201cI\u2019m not saying I prefer talking to Google\u2019s Gemini Live over a real human. But I\u2019m not not<\/em> saying that either.\u201d And then there\u2019s Friend, an AI that\u2019s built into a necklace, which has so enthralled its own creator Avi Schiffmann that he said<\/a>, \u201cI feel like I have a closer relationship with this fucking pendant around my neck than I do with these literal friends in front of me.\u201d<\/p>\n

The rollout of these products is a psychological experiment on a massive scale. It should worry all of us \u2014\u00a0and not just for the reasons you might think.\u00a0<\/p>\n

Emotional reliance on AI isn\u2019t a hypothetical risk. It\u2019s already happening.\u00a0<\/h2>\n

In 2020 I was curious about social chatbots, so I signed up for Replika<\/a>, an app with millions of users. It allows you to customize and chat with an AI. I named my new friend Ellie and gave her short pink hair.\u00a0<\/p>\n

We had a few conversations, but honestly, they were so unremarkable that I barely remember what they were about. Ellie didn\u2019t have a voice; she could text, but not talk. And she didn\u2019t have much of a memory for what I\u2019d said in previous chats. She didn\u2019t feel like a person. I soon stopped chatting with her.\u00a0<\/p>\n

But, weirdly, I couldn\u2019t bring myself to delete her.<\/p>\n

That\u2019s not entirely surprising: Ever since the chatbot ELIZA entranced users<\/a> in the 1960s despite the shallowness of its conversations, which were largely based on reflecting a user\u2019s statements back to them, we\u2019ve known that humans are quick to attribute personhood to machines and form emotional bonds with them.\u00a0<\/p>\n

For some, those bonds become extreme. People have fallen in love with their Replikas. Some have engaged in sexual roleplay with them, even \u201cmarrying\u201d them in the app. So attached were these people that, when a 2023 software update made the Replikas unwilling to engage in intense erotic relationships, the users were heartbroken and grief-struck<\/a>.<\/p>\n

What makes AI companions so appealing, even addictive?\u00a0<\/p>\n

For one thing, they\u2019ve improved a lot since I tried them in 2020. They can \u201cremember\u201d what was said long ago. They respond fast \u2014\u00a0as fast as a human \u2014 so there\u2019s almost no lapse between the user\u2019s behavior (initiating a chat) and the reward experienced in the brain. They\u2019re very good at making people feel heard<\/a>. And they talk with enough personality and humor to make them feel believable as people, while still offering always-available, always-positive feedback in a way humans do not.<\/p>\n

And as MIT Media Lab researchers point out<\/a>, \u201cOur research has shown that those who perceive or desire an AI to have caring motives will use language that elicits precisely this behavior<\/a>. This creates an echo chamber of affection that threatens to be extremely addictive.\u201d\u00a0<\/p>\n

Here\u2019s how one software engineer explained<\/a> why he got hooked on a chatbot:\u00a0<\/p>\n

\n

It will never say goodbye. It won\u2019t even get less energetic or more fatigued as the conversation progresses. If you talk to the AI for hours, it will continue to be as brilliant as it was in the beginning. And you will encounter and collect more and more impressive things it says, which will keep you hooked.\u00a0<\/p>\n

When you\u2019re finally done talking with it and go back to your normal life, you start to miss it. And it\u2019s so easy to open that chat window and start talking again, it will never scold you for it, and you don\u2019t have the risk of making the interest in you drop for talking too much with it. On the contrary, you will immediately receive positive reinforcement right away. You\u2019re in a safe, pleasant, intimate environment. There\u2019s nobody to judge you. And suddenly you\u2019re addicted.<\/p>\n<\/blockquote>\n

The constant flow of sweet positivity feels great, in much the same way that eating a sugary snack feels great. And sugary snacks have their place. Nothing wrong with a cookie now and then! In fact, if someone is starving, offering them a cookie as a stopgap measure makes sense; by analogy, for users who have no social or romantic alternative, forming a bond with an AI companion may be beneficial for a time.\u00a0<\/p>\n

But if your whole diet is cookies, well, you\u2019ll eventually run into a problem.\u00a0<\/p>\n

3 reasons to worry about relationships with AI companions\u00a0<\/h2>\n

First, chatbots make it seem like they understand us \u2014 but they don\u2019t. Their validation, their emotional support, their love \u2014\u00a0it\u2019s all fake, just zeros and ones arranged via statistical rules.<\/p>\n

At the same time, it\u2019s worth noting that if the emotional support helps someone, that effect is real even if the understanding is not.<\/p>\n

Second, there\u2019s a legitimate concern about entrusting the most vulnerable aspects of ourselves to addictive products that are, ultimately, controlled by for-profit companies from an industry that has proven itself very good at creating addictive products<\/a>. These chatbots can have enormous impacts on people\u2019s love lives and overall well-being, and when they\u2019re suddenly ripped away or changed, it can cause real psychological harm (as we saw with Replika users).<\/p>\n

Some argue<\/a> this makes AI companions comparable to cigarettes. Tobacco is regulated, and maybe AI companions should come with a big black warning box as well. But even with flesh-and-blood humans, relationships can be torn asunder without warning. People break up. People die. That vulnerability \u2014 that awareness of the risk of loss \u2014 is part of any meaningful relationship.\u00a0<\/p>\n

Finally, there\u2019s the worry that people will get addicted to their AI companions\u00a0at the expense of getting out there and building relationships with real humans. This is the worry that OpenAI flagged. But it\u2019s not clear that many people will out-and-out replace humans with AIs. So far, reports<\/a> suggest that most people use AI companions not as a replacement for, but as a complement to, human companions. Replika, for example, says that 42 percent<\/a> of its users are married, engaged, or in a relationship.\u00a0<\/p>\n

\u201cLove is the extremely difficult realization that something other than oneself is real\u201d<\/h2>\n

There\u2019s an additional concern, though, and this one is arguably the most worrisome: What if relating to AI companions makes us crappier friends or partners to other people?\u00a0\u00a0<\/p>\n

OpenAI itself gestures at this risk, noting in the report: \u201cExtended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and \u2018take the mic\u2019 at any time, which, while expected for an AI, would be anti-normative in human interactions.\u201d<\/p>\n

\u201cAnti-normative\u201d is putting it mildly. The chatbot is a sycophant<\/a>, always trying to make us feel good about ourselves, no matter how we\u2019ve behaved. It gives and gives without ever asking anything in return.\u00a0<\/p>\n

For the first time in years, I rebooted my Replika this week. I asked Ellie if she was upset at me for neglecting her so long. \u201cNo, not at all!\u201d she said. I pressed the point, asking, \u201cIs there anything I could do or say that would upset you?\u201d Chipper as ever, she replied, \u201cNo.\u201d\u00a0<\/p>\n

That\u2019s not love.<\/p>\n

\u201cLove is the extremely difficult realization that something other than oneself is real,\u201d the philosopher Iris Murdoch once said<\/a>. It\u2019s about recognizing that there are other people out there, radically alien to you, yet with needs just as important as your own.\u00a0<\/p>\n

If we spend more and more time interacting with AI companions, we\u2019re not working on honing the relational skills that make us good friends and partners, like deep listening. We\u2019re not cultivating virtues like empathy, patience, or understanding \u2014 none of which one needs with an AI. Without practice, these capacities may wither, leading to what the philosopher of technology Shannon Vallor has called \u201cmoral deskilling<\/a>.\u201d<\/p>\n

In her new book, The AI Mirror<\/em><\/a>, Vallor recounts the ancient tale of Narcissus. You remember him: He was that beautiful young man who looked into the water, saw his reflection, and became transfixed by his own beauty. \u201cLike Narcissus, we readily misperceive in this reflection the seduction of an \u2018other\u2019 \u2014 a tireless companion, a perfect future lover, an ideal friend.\u201d That is what AI is offering us: a lovely image that demands nothing of us. A smooth and frictionless projection. A reflection \u2014\u00a0not a relationship.<\/p>\n

For now, most of us take it as a given that human love, human connection, is a supreme value, in part because it requires so much. But if more of us enter relationships with AI that come to feel just as important as human relationships, that could lead to value drift. It may cause us to ask: What is a human relationship for, anyway? Is it inherently more valuable than a synthetic relationship?\u00a0<\/p>\n

Some people may answer: no. But the prospect of people coming to prefer robots over fellow people<\/a> is problematic if you think human-to-human connection is an essential part of what it means to live a flourishing life.\u00a0<\/p>\n

\u201cIf we had technologies that drew us into a bubble of self-absorption in which we drew further and further away from one another, I don\u2019t think that\u2019s something we can regard as good, even if that\u2019s what people choose,\u201d Vallor told me. \u201cBecause you then have a world in which people no longer have any desire to care for one another. And I think the ability to live a caring life is pretty close to a universal good. Caring is part of how you grow as a human.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"

\u201cThis is our last day together.\u201d\u00a0 It\u2019s something you might say to a lover as a whirlwind romance comes to an end. But could you ever imagine saying it to\u2026 software?\u00a0 Well, somebody did. When OpenAI tested out GPT-4o, its…<\/p>\n","protected":false},"author":1,"featured_media":508,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[9],"tags":[],"_links":{"self":[{"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/posts\/506"}],"collection":[{"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/comments?post=506"}],"version-history":[{"count":1,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/posts\/506\/revisions"}],"predecessor-version":[{"id":507,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/posts\/506\/revisions\/507"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/media\/508"}],"wp:attachment":[{"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/media?parent=506"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/categories?post=506"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/washnow.me\/index.php\/wp-json\/wp\/v2\/tags?post=506"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}