In Peter Farrelly’s Dumb and Dumber (1994), the titular dummy, Harry Dunne, relates a still-raw heartache to his similarly dimwitted buddy Lloyd Christmas while the two are sitting in a heart-shaped hot tub. Lloyd wants him to get back on the horse, but Harry isn’t over being dumped. “I thought we were going to be together forever,” he laments. “About a week later, right out of the blue, she sends me a John Deere letter.” Lloyd asks if she gave him cause. “She gave me a bunch of crap about me not listening to her enough or something,” Harry says. “I don’t know, I wasn’t really paying attention.”
My knee-jerk reaction to the rise of companions powered by artificial intelligence is to oppose the idea with the single-mindedness of Cato the Elder, who concluded every one of his speeches with “Carthago delenda est.” But when I consider that the above exchange from Dumb and Dumber is probably all too representative of the way humans interact with one another today, I pause. Too often we are guilty of hearing but not listening, of merely waiting for our turn to speak past someone else.
If we get replaced by machines, it may be because we have forgotten how to be human and treat others like humans. Indeed, the machine has one key advantage: It has nowhere else to be and nothing else to do but listen.
Consider Ebb, an AI-powered “empathetic companion” designed to help users process their emotions and thoughts.
A writer named Carly Quellman documented her experience with the app in CNET. She described episodes of “emotional turbulence” to her digital companion, and a typical response would go like this: “Carly Que, it sounds like you’re navigating a significant transition and feeling uncertain about whether you’re finding ease or settling for less.” Ebb would then recommend various mental exercises to help her “find stillness and stay grounded.”
According to Quellman, this model provided her with a measure of comfort that eased her emotional state in much the same way that speaking with a human therapist might have done. Presumably, it also cost her less than an office visit. Moreover, it spares the user the discomfort of having to divulge one’s pain and peccadillos to someone who might sit in judgment.
Ebb is just one of many machines geared toward mental health self-care. An even better-known one is Anthropic’s Claude, a particularly loquacious chatbot that has received rave reviews from users for its therapeutic powers.
“Claude has helped me figure out major breakthroughs in understanding my own trauma and emotional issues from my childhood that were more impactful than human therapists,” one person wrote on the r/ClaudeAI subreddit. “Like, deeply insightful framings that have legitimately left me with a much deeper understanding of myself.”
“For real, Claude is a good listener.”
Are you?
Obviously, there is a dark side to this kind of tech. There have been cases of chatbots nudging vulnerable users toward suicide. One woman divorced her husband with encouragement from ChatGPT. But there are also less tangible consequences associated with outsourcing our relationships to apps.
Last year, Common Sense Media found that seven out of 10 teens ages 13 to 18 reported having “used at least one type of generative AI tool.” The top use case, unsurprisingly, is homework. The second most common use? To “stave off boredom.”
Can we say sayonara to imaginary friends born of boredom, in other words, to the embryo of imagination?
We don’t know what to expect from a generation of kids who grow up interacting with ceaselessly affirming chatbots, but it’s probably not great for them or us.
And yet, if kids are spending this much time communing with AI, it’s likely because there’s some underlying issue. Maybe mom and dad work a lot. Maybe they don’t spend much time with their children at home, and not because they dislike them. Our time is finite; our free time is precious. That’s also why we don’t always have time to listen to one another and seem, instead, to just wait our turn to speak, as if all conversation were merely a series of prompts for our own responses.
All these things may simply be facts of life. But they are contributing to the formation of parasocial bonds with algorithms, the farming out of human emotionality to inhuman processes. That very well could serve some salutary purpose for those in the clutches of loneliness and despair.
However, something fundamental is being lost here, and the extent to which these things proliferate will hinge on whether we remember to slow down and be human. All you have to do is learn to listen better than a chatbot.