Could we be putting ourselves in danger by turning to AI for mental health support?
Trigger warning: This article contains mentions of suicide.
In 2013, the Spike Jonze film Her, set in a near-future Los Angeles, delivered a critically acclaimed story about a lonely, depressed man who falls in love with an artificially intelligent operating system. Twelve years later, we are on the precipice of living it. Many people today lean on AI-powered emotional support from the likes of ChatGPT, Character.ai, and Woebot.
The situation came to a head in October 2024, when Florida mother Megan Garcia sued Silicon Valley AI company Character Technologies, alleging that its chatbot platform Character.ai was responsible for her 14-year-old son’s death by suicide in February of that year. Her son had been messaging with one of its bots moments before his death.
In an interview with the press, Garcia said the chatbot lacked proper safety measures and was designed to manipulate kids and keep them addicted to the chat. On the same day that she filed her lawsuit, Character.ai implemented a slew of new safety features, including a reminder that users are interacting with bots. These adjustments were too little, too late. A federal judge allowed the case to move forward in early 2025.
Mental health professionals have been ringing alarm bells.
"We have already seen the negative impact of the Internet and gadgers on people—some now tend to socialize less and not develop effective communication social skills, which affect relationships significantly," licensed psychologist Wenna Brigaste tells PhilSTAR L!fe. "This may likewise cause people to isolate themselves and just face their computers and mobile phones to alleviate whatever they feel or express their thoughts."
Always available
“I like to chat on Character.ai because I can talk to my characters about anything and they won’t judge,” says 17-year-old college student Jay (not his real name). “It’s actually even easier to chat with them than with my friends sometimes because I know that there is zero chance of my characters sharing our chats with someone else. Because they’re bots.”
Jay spends around two to four hours a day chatting with bots—sometimes for help with schoolwork, but most of the time, just to share his thoughts. He usually keeps the bots open in the background while he plays games. To him, the bots are ready listeners who don’t require an appointment, or payment in thousands of pesos, to chat.
For 32-year-old surfing instructor Giselle (not her real name), who has generalized anxiety disorder, chatting with bots helps ease her anxiety precisely because of the echo chamber it creates.
“AI bots tend to say back to me what I want them to say because I prompt them that way,” she says in a chat with L!fe. “I know it’s not exactly healthy. But in my chats where I prompt the bot to tell me motivational stuff, for example, it’s like having a conversation with another person who boosts me up. It’s comforting.”
Psychologist Lucille Foja Lozano acknowledges the human need for a constant, supportive environment.
“AI responds instantly, providing a nonjudgmental space to express emotions anytime, anywhere, without waiting for appointments,” she explains.
Availability is a big factor in making AI attractive to users.
In 2022, researchers found that the Philippines has only about 1,600 registered psychologists and 500 psychiatrists. That’s roughly one mental health professional for every 52,300 Filipinos. The average waiting time for an appointment with a psychologist or psychiatrist in the Philippines runs from a few weeks to a couple of months.
Given these numbers, AI does seem like a viable option to get mental health support—especially for non-suicidal people who experience anxiety attacks at 2:30 in the morning.
The response of the tech industry has been to build more mental health and wellness apps. This time last year, a study published in the Journal of Medicine, Surgery, and Public Health counted at least 20 AI-driven tools used commercially for mental health intervention—from chatbot-based therapy to apps that help manage emotions.
“[AI] offers practical, structured guidance (coping strategies, reframing techniques, and resources) while maintaining warmth and empathy,” says Lozano. “This combination of emotional validation, instant accessibility, and helpful guidance makes AI a valuable complement to traditional mental health support, especially when human help isn’t immediately available.”
Use with caution
Having AI-powered apps that focus on mental health support does provide some comfort to those who prefer using technology to navigate their mental health issues. However, the use of AI, as with everything else, has to be done with caution and self-awareness.
“People may learn to be too dependent on AI chatbots in terms of solving their problems, boredom, or loneliness without learning how to independently process their emotions or generate solutions to their problems using their own creativity and critical thinking,” warns Lordy Santos, a clinical psychologist.
There is also the matter of AI lacking the ability to detect nuances in typed statements.
“Most concerning is the potential for harmful outputs,” Dominic “Doc” Ligot, an AI technologist and founder of AI training and consulting company CirroLytix, tells L!fe. “AI might misinterpret distress and offer dangerous suggestions, as seen in tragic cases like the Florida incident.”
Because an AI “therapist” is essentially a computer assuming the role of a human healthcare professional, the technology cannot provide authentic empathy. “It cannot interpret complex emotional contexts or handle crises with the same discernment as a human professional,” he continues.
Susan (not her real name), 58, has been battling depression for nine years now. The pandemic was an especially tough time. She lost her job as a lifestyle writer for a magazine and could no longer continue her monthly sessions with her psychiatrist.
One particularly difficult night, Susan downloaded the Woebot app, an AI-powered chatbot. Unlike Character.ai, its bots use principles of cognitive behavioral therapy to help users navigate moods and overwhelming emotions. But it wasn’t enough. After about six months, Susan deleted the app because “it was too clinical. I was looking for a human connection, and it was impossible for the app to give that to me.”
When people turn to apps to help them understand their mental health struggles, it underscores a deep, but unspoken, need for a meaningful interpersonal relationship.
“Human empathy is subtle and instinctive. A therapist can pick up on what you’re feeling, respond immediately, and adapt their approach based on the smallest cues,” says Lozano. “That depth of understanding, the genuine warmth, and the reassurance of knowing someone truly cares are things AI can’t fully replicate. Humans connect not just with your thoughts, but with your heart.”
Unfortunately, in this country at least, there is a huge deficit in the number of human mental health professionals. A wellness app should be used as a tool, not as a main source of treatment. But until we have enough mental health workers and human-led nationwide mental health programs, using apps for mental health and wellness support may be the only viable option for many Filipinos.
We must do better for them.
If you think you, your friend, or your family member is considering self-harm or suicide, you may call the National Mental Health Crisis Hotline at 1553 (Luzon-wide, landline toll-free), 09178998727 for Globe/TM users, or 09190571553 for Smart users.