AI can’t be your therapist: ‘These bots basically tell people exactly what they want to hear,’ psychologist says

Increasingly, people are turning to AI chatbots like Character.ai, Nomi and Replika for friendship and mental health support. And teenagers in particular are leaning into this tech.

A majority of teenagers, 72% of those ages 13 to 17, have used an AI companion at least once, according to a new report by media and tech ratings nonprofit Common Sense Media. Survey respondents said they use AI for conversation and social practice (18%), emotional or mental health support (12%) and as a friend or best friend (9%).

AI can be a powerful tool, but it’s no substitute for genuine human interactions, both personal and professional ones like a therapist, psychologist and researcher Vaile Wright said on a recent episode of the “Speaking of Psychology” podcast by the American Psychological Association.

“It’s never going to replace human connection,” she said. “That’s just not what it’s good at.” Here’s why.

AI chatbots were not built to provide fulfilling, long-term interactions, experts say.

“AI cannot introduce you to their network,” Omri Gillath, professor of psychology at the University of Kansas, told CNBC Make It back in May. It can’t introduce you to new friends or significant others and it can’t give you a hug when you need one.

Instead, chatbots were “built to keep you on the platform for as long as possible because that’s how they make their money,” Wright said of the companies that create them. They do that “on the backend by coding these chatbots to be addictive.”

Ultimately, a relationship with a chatbot feels “fake” and “empty” when compared to a relationship with a human, Gillath said.

Therapy and companionship are the top reasons people turn to generative AI and chatbots, according to Harvard Business Review reporting. But experts warn that AI cannot — and should not — be your therapist.

“These bots basically tell people exactly what they want to hear,” Wright said. “So if you are a person that, in that particular moment, is struggling and is typing in potential harmful or unhealthy behaviors and thoughts, these types of chatbots are built to reinforce those harmful thoughts and behaviors.”

Another major weakness of this tech is that AI has knowledge, but not understanding.

“An AI chatbot unfortunately knows that some legal drug use makes people feel better,” Wright said. “It gives you a high and if somebody is saying I’m low and depressed, that might be advice it gives. But it doesn’t understand that you don’t give that advice to people in recovery from illegal drug use.”

That difference between knowing and understanding “is actually really critical when we’re talking about the use of these for therapy.”
