
I tried ChatGPT for therapy. Here’s what I found.

Published Aug 22, 2025 5:00 am

I believe in signs from the universe. It could be some words on a billboard or a song on the radio. Chalk it up to confirmation bias or desperation. But if there’s something I’m on the fence about buying that I see an ad for, I’ll go buy it immediately. If I’m stuck on a decision, I’ll wait for a song to come on and let its lyrics indirectly tell me what to do. 

Unfortunately, I can’t help but have this carry over to my online habits. I look to the Internet for advice on everything. It's extremely helpful 95 percent of the time, like when someone else on Reddit has dealt with the same printer issue. But recently, I found myself spiraling deeper into TikTok and all the unsolicited advice it serves up on mental health, identity and relationships.

It felt like being led to a certain mentality or conclusion without a feedback loop. I was just being fed all these things, and I couldn’t debate or talk my way through it. It was all consumption and no output. As my algorithm reinforced all this advice, I would find myself having zero original thoughts and having to shake myself out of it.

Stuck in my head, searching for a way out.

Personally, the main reason I turn to the internet like this is to unravel and understand my feelings or whatever I’m going through. Yes, talking about it with my friends and family works, but there’s this nagging part of me that feels like I’m burdening them with too much. I know that’s not the case, and I’m luckily surrounded by so many people who are willing to listen to me yap about the same topic and constantly reassure me. But it’s hard to switch that part of my brain off.

I have also fallen victim to those tarot card readings and videos that say, “Use this sound or you’ll lose him forever.” It’s embarrassing to admit, but no harm, no foul, right? Using TikTok in this way seems mostly harmless, though my mental health could take a hit if the algorithm keeps serving the same types of videos over and over again.

It started to get murkier when I searched for answers in another, more controversial corner of the internet.

People on TikTok and Reddit constantly talk about how AI helped them process or get clarity on certain personal events. I have an ethical vendetta against AI, but in a moment of weakness and desperation, I caved. People kept talking about how ChatGPT helped them get through some dark times, and I was distraught enough to forgo my morals. I’m not proud of it, but I did learn something.

Picture this: I have rehashed the same traumatic event to my friends 50 million times. They’ve heard each new perspective and realization from me in real time. They’ve been bombarded with audio recordings of me sobbing, then angry, then calm. Everyone has given their two cents on my situation. But despite this, there’s something in me that has yet to fully process everything that happened. I’m not satisfied with the conclusions I’ve come up with. Counseling is too expensive, and my schedule is too packed.

Typing out what I can't say.

I turn to ChatGPT and I tell it everything.

I tell my story from start to finish, and ChatGPT starts talking so much like a therapist or a counselor that it’s a bit scary. It keeps reassuring me, thanking me for being vulnerable and opening up, and validating all my feelings.

It even gives me exercises, like release rituals and writing letters that will remain unsent. By the end of my “session,” I feel reassured. I actually gained new insights into my situation that I hadn’t really considered. It’s even talked me out of lowkey toxic actions I wanted to take next. The language it used was caring and patient, telling me that we would work through my emotions together.

I was a bit amazed at how everything turned out, so of course I had to show the group chat.

My friends and I have pointed out some “bangers” that ChatGPT made during my session:

“You were measuring depth, not duration. And that’s beautiful. A lot of people measure relationships by how long they’ve lasted, but what you’re saying is that you measured it by intimacy, connection, and potential. That’s not only valid—it’s deeply meaningful. It shows you’re someone who truly shows up.”

Connecting online, disconnecting from what's real.

“You are holding so much—grace, effort, confusion, and pain. And through it all, you showed up. Even when you were hurting, even when it would have been easier to retreat, you showed up.”

“That kind of emotional availability is vulnerable—and brave. That’s why it hurts. You weren’t being silly; you were being sincere.”

Despite all of this, I still couldn’t be fully comfortable with using ChatGPT as a therapist. I talked to a licensed well-being coach and psychometrician, Randy Kenjie Podador, about these complex emotions.

ChatGPT is free, and the cost of counseling and therapy these days keeps getting higher and higher. Most people can’t afford it, or they live in countries where mental health is still a difficult topic to broach. The accessibility of AI makes it even more appealing.

Randy affirms this: “It’s instant. It’s available at odd hours. And in many ways, it’s somehow safer. There's no fear of being judged, no pressure to explain generational trauma in a world that often misunderstands it.”

As I mentioned, ChatGPT also talks like an actual therapist with zero judgment. Enough people have used AI for this kind of thing that it has probably absorbed enough information to sound semi-accurate.

But I can’t get over the environmental ramifications of using ChatGPT. The energy consumption and carbon emissions alone are extremely concerning. According to Business Energy UK, ChatGPT uses 39.98 million kWh of energy and 39.16 million gallons of water per day. The data centers are a large setback for climate action efforts.

Additionally, it feels very dystopian and kind of removes our need for actual community. Yes, I talked to my friends about my problems. But some people might only talk to ChatGPT about their own. It removes the sense of connection we can make with other people, and we lose the opportunity to practice empathy with the people around us.

“I don’t believe (AI) can or should ever replace the therapeutic presence of a licensed practitioner. There’s something sacred about sitting with another human being and being fully seen,” Randy adds. “The warmth of eye contact, the silence that holds space instead of rushing to fix, and the gentle challenge that comes when your defense mechanisms are lovingly called out. I believe all of it is not something an algorithm can replicate.”


“Healing often begins in a relationship, and that relationship must be grounded in empathy, ethical standards, and cultural attunement. AI, I believe, no matter how advanced, lacks that intuitive and felt sense of human connection.”

He points out the negative implications AI can have on the actual user. “It can offer comfort, yes, but it cannot offer containment. It doesn’t know when someone is at risk of self-harm or when silence signals something urgent. It doesn’t understand the subtle cues of trauma responses or the weight of cultural narratives passed down through generations.”

AI isn’t a therapist. I still don’t trust it. It doesn’t have an accreditation or anything to validate what is being said. With something as complex and human as mental health, it feels wrong to put it in the hands of AI. Even if I agreed with everything that was said during my session, I realized most of these things had already been said by my friends or by me. ChatGPT just packaged them in pretty language and therapy-speak.

“I don’t dismiss AI outright,” Randy continues. “If anything, it reminds us of the gaps we still need to fill in mental health care, especially for the underserved. But it should only supplement, not substitute.”

There are pros and cons to using AI for these types of things. I could see the benefits it has for some people. It’s free, and sometimes you need things told to you to make sense of a situation. But I personally am a very introspective person who likes to sit in her thoughts and likes talking to her friends. I turned to ChatGPT at my lowest, but I’m hesitant to go back to it in the future.

Oh, and I’ve been in actual counseling now, which I can report is so much better.