The ad is all about romance — finding a partner who understands you and will chat with you whenever you want. You can request photos and send gifts. Your partner will be attentive, emotionally supportive at all times and probably tend to agree with you a lot more than anyone else you know.
You can even choose what your partner looks like, a design-your-own love interest.
That’s possible because, although it feels real, your love interest is a fake.
There are similar promos all over the internet. Artificial intelligence has moved onto the “dating” scene in a big way, with a number of companies offering companionship that can be quite literally bot and sold.
Here’s the big question: Is trading real relationships with all their contradictions and complications for a more compliant one a good idea?
Artificial intelligence and matchmaking aren’t strangers to each other. Initially in the dating realm, AI was being used to fool folks who were already on dating apps into giving up personal information, said Neil Sahota, CEO of ACSI Labs and a United Nations advisor on AI. Back then, bots were creating fake profiles for the gain of those who employed them.
They were also being tested as a tool to conquer loneliness. A decade ago, Sahota was helping build up IBM Watson, a computer system that could answer questions in natural language, as its website says. And a number of companies were already exploring the potential of AI for companionship or for mental health.
It was a real need. At one point, loneliness was the biggest illness in the world, before COVID-19 surpassed it, Sahota told the Deseret News. “About 40% of people suffered from moderate to severe loneliness,” so he and others were exploring what healing role AI could play, maybe as a companion or even a therapy tool. They weren’t pondering it as a substitute for real relationships, but considered its possibilities as an outlet for someone to tell stories to or to build up communication skills or in other ways that would enhance the opportunities to form rich and real, satisfying relationships with others.
At that point, developers were asking questions and exploring things like whether AI could help if someone was distressed at 2 a.m. and couldn’t reach anyone: Could AI talk to them and look for signs they were dangerous or suicidal? If so, could it alert the right people?
That morphed into artificial empathy.
But with the march of time and technology, there are now many companion AIs serving different purposes, from AI that gives an older person what feels like a caring listener to make-believe love interests.
And that’s where it gets tricky, experts told the Deseret News. Sahota refers to mirror images of good and bad, with the potential for tech to be weaponized in ways that may actually do harm. In some cases AI is becoming a crutch and substitute for relationships, he said, noting that in others it goes further. In Japan, for instance, he said that AI gloves and bodysuits combined with avatars can to some degree replicate the “sensations of a full relationship.”
Sahota calls the advance of AI into relationships worrisome, given that the younger generation seems less keen on face-to-face interactions than previous generations. “You might be heading down the path where people might actively seek the AI substitute because they feel like they won’t get rejected. They won’t be judged. I think (some young people) feel like it’s a safer space.”
Launching into adult life comes with challenges like housing costs and a tough job market. People are delaying both marriage and having children.
“What happens if people don’t ever get married or have kids?” Sahota asks.
Like Sahota, Alexander De Ridder crafts AI tools in his role as cofounder and chief technical officer at SmythOS, a company that helps businesses integrate artificial intelligence. While he’s a fan of AI, he doesn’t want it to replace human relationships. He bluntly calls that “unhealthy” for mental health and human development. He’s no psychologist, he told the Deseret News, but he is a family man with a fully human interest in seeing people thrive.
That AI could disrupt normal relationship development is a real possibility for some, said De Ridder, who notes that young children become wildly attached to inanimate objects like teddy bears and they cry if someone’s not nice to a beloved toy. Adults get attached to objects, too. Lots of people give cars human names. “It is this fundamental capacity — a human capacity to breathe life into things.”
Add that to the fact that people are “a bit obsessed with themselves,” De Ridder said, and see what happens. AI can speak like a human, write like a human … and what’s not to love about something that can be taught to seem obsessed with you? “It’s easy to see how attached we can be to AI that gives validation and reciprocation and support.”
According to De Ridder, teens are drawn to some AI apps, including ones that target those ages 14 to 20 and especially appeal to females, who may spend hours there. “Boys are not always emotionally available,” said De Ridder, who notes one gender difference is that girls journal more. They are more outwardly emotional, so it’s probably appealing to have “an AI boyfriend who is always there, always pays attention, always offers a thoughtful response.”
A bot that starts out as a virtual friend may evolve into a virtual boyfriend, he said. But will girls even seek a real one?
That potential to stay in virtual reality is even greater, per Sahota, because one of AI’s superpowers is learning a person on a psychological level, adapting to how one talks and what resonates with that individual. Some AI agents “build people at a very deep level, like better than a best friend or even a spouse.”
That doesn’t mean humans and AI will marry, Sahota says. But they may become smitten. “A lot of things would have to happen and I don’t think we’re anywhere close to that. The robot technology is really far away,” so AI can’t replace human touch very well and touch is a human need.
But using AI as an emotional crutch can still be very problematic.
De Ridder enjoys a good philosophical debate with a bot trained on the writings of Plato and Aristotle. He loves to discuss the world’s modern problems with the ancient philosophers. It’s fun, too, to converse with popular society or historical people via bots that know their written work. But some aspects of life should be between real people, he says.
Robots don’t have their own will or needs, but are programmed to what others want. Being catered to may create selfishness and unrealistic expectations that make it hard to thrive in the real world, De Ridder said. “What do we call people who only have empathy for themselves and their own needs, who expect others to cater to them?” he asks. “We call them pretty bad names.”
Real relationships are messy and De Ridder thinks people are better for that fact. They wrestle and grow and prioritize and compromise, lose sleep and learn to put others first. The process builds both stronger individuals and societies. The counterweight to selfishness, per De Ridder, is to “foster community, communion, charity, empathy, collaboration and participation. Real tangible interactions with other people.”
Teens especially need to be out relating to other actual people, says De Ridder. “They are lonely. It’s a time of vulnerability” when anything that reinforces isolation and separation from others is not a good idea.
One of the most troubling aspects of design-your-own AI relationships is emotional immaturity, said Jill Manning, a licensed marriage and family therapist in Louisville, Colorado. Without real relationships, “people tend to get emotionally stuck and they don’t develop emotionally.”
“One of the core tenets of maturity is to live in reality and accept reality,” says Manning, who noted that’s also a core tenet of spiritual life. “Not trying to change reality to our whims, but submitting and accepting reality.”
She predicts that “people can pursue — and probably many will pursue that AI illusion — but eventually that illusion will become a delusion and delusion is a facet of mental illness.”
Relationships help us grow. Some can be harmful, of course, she adds, and she sees plenty of those in her clinical practice. But “within the normal scope of human relationships, they help us grow and refine ourselves, see our faults and weaknesses, as well as our strengths.” A relationship with no foibles or quirks doesn’t help people grow. “It’s spiritually suffocating.”
For overall wellness, people need human connection.
“Touch is essential,” says Manning. “Young babies can die without nurturing, affirming touch. That speaks to our human DNA need for touch.” Eye contact matters, as does communication. “Our digital devices and technology can certainly be an asset and help us stay connected, but when there’s a total swap out, it’s void of so many nuances of human connection that can’t be replaced.”
She points out the difference between knowing someone and using technology to stay in touch and maintain contact and having a digital-only connection.
Although she’s certified in and often uses teletherapy to treat patients, she prefers the initial encounter be in person. “I learned through experience there’s information I can glean sitting in the same room that I cannot glean over HIPAA-compliant video. I can’t see the foot tapping with anxiety. Even the tone gets lost in video. Video is a pretty good replacement in many situations, but for assessing overall wellbeing, nothing’s like sitting in the same place.”
She adds, “If someone is similarly trying to have a friendship or a boyfriend or girlfriend that’s AI generated, it’s an echo chamber — a delusion.” And it may be addictive.
“Anything that can evoke a dopamine reward response in the brain does have the risk of becoming compulsive or getting out of control. Like anything — online shopping, eating, pornography, all of those and this too would have a compulsive potential.”
Manning said it’s important to not get so polarized that we throw potential benefits out.
“As a clinician, I could think of scenarios where AI could serve a very useful purpose when dosed appropriately and in the larger context of other strategies and coping skills.” She could picture value as a practice tool for certain conversations or questions or for learning to be more assertive. It might help in practice for interviews. “I can see value in that, where the risk is low and there are opportunities to self-correct and try again.”
There are valid and valuable uses for artificial intelligence, even regarding relationships. Experts say AI can help lonely older people, who’ve perhaps outlived a spouse or are somewhat homebound.
They are in an entirely different place than young people who might choose AI over human interaction.
Sahota points out that older people “have already had that history. They’ve had relationships and face-to-face interactions.” He sees AI for older adults as an “outlet, not as much a substitute.”
He notes a company that created an AI companion so people can tell stories, which the AI logs to create an autobiography that can be shared with family and friends. It has a practical purpose and helps beat loneliness a bit as older people tell their stories.
Seniors don’t typically use AI for dates. Gen Z and Gen Alpha are a different story. They have less experience with social interaction than with communicating or doing things over devices. Sahota describes young people texting friends “literally sitting right next to each other” at a sporting event. As for those who are worried about rejection or their self-esteem, he said, “I can see how they can easily just switch over to an AI partner, if you will.”
But they might benefit from AI if they used it differently. AI can be a powerful tool to help those same people practice the art of conversation and reaching out to others. It can provide feedback and experience and even recommendations. One company just rolled out an app for couples, now in beta testing, that helps analyze a relationship situation: “You’re trying to say this, but it’s coming across like this. I recommend … ”
Listening and logging stories, checking in on mental health and boosting communication skills are all good things, Sahota said. AI’s utility hinges on how it’s used.
Manning agrees AI could help an elderly person who’s isolated and has trouble getting out. “But I think we have to ask ourselves why, as a society, are so many of us lonely. Why would a growing number of people be in a situation where they are so disconnected and so alone and no one’s got eyes or ears on them?”
There are also some similarities with something most people agree is problematic. For years, experts have made one point about pornography: When males can choose what happens, what those engaged in sexual activity look like, and be titillated on the schedule of their choosing, how can a flesh-and-blood female compete?
AI offers a similar question that may be worth pondering. When females can create a relationship that’s always affirming, that emotionally satisfies and is available whenever, how can a flesh-and-blood male compete?
Additionally, both porn and AI relationships draw people into something that’s not real, Manning said.
Porn, she notes, creates a “relational script in which one doesn’t ever have to be authentic, transparent or vulnerable with another person.” Human connection requires learning how to be those things.
Manning said with porn the body responds sexually to pixels on a screen, ignoring that it’s not real. “People are left empty and craving more because there’s never enough. You can’t fill that because you’re not deeply satisfying the human need for real connection. I think with AI as well, there’d be the absence of authenticity, transparency and vulnerability and learning how to practice with that. It would certainly engender a narcissistic posture as well.”
There is, however, a real difference, Manning said. Porn is never helpful to the brain. AI might be, if it’s used wisely and not allowed to cut real relationships out.