
AI as Therapist: Substitute or Complement to Human Psychology?

By Emmanuelle Brunet - Kalmy
CEO

Fri, 09/05/2025 - 07:30

Would you share your most intimate problems with an artificial intelligence instead of a human therapist? What seemed like science fiction a few years ago is already happening today: millions of people are talking with chatbots to calm anxiety, organize their thoughts, or receive immediate advice.

1. The Context: A Mental Health Crisis

Mental health is one of the biggest challenges of our time. In Mexico, 1 out of 4 people will experience a mental disorder at some point in their lives (INEGI). Only 2% of total health spending goes to mental health. This is a very low proportion, especially considering the global burden of mental health disorders, and a disproportionate amount of this limited funding goes to operating psychiatric hospitals rather than prevention or community-based care.

The result is a huge gap: millions of people live with stress, anxiety, or depression without access to professional help.

I personally discovered AI in this role after listening to a podcast by Oso Trava with Pepe Aguilar. In the interview, Aguilar shared how surprised he was by the accuracy of the AI's answers. He even showed an AI conversation to his therapist, who admitted there was very little to add. Aguilar then decided to leave traditional therapy and rely on AI alone.

In my case it was the opposite: I started with AI, curious after hearing that podcast, and later looked for a human therapist to complement it. What I liked was that the AI allowed me to open up faster than I would have with a human. The therapist then helped me go deeper, building on the ground I had already covered with the AI. It is also worth saying that my therapy is online: with the traffic in my city, attending in person would simply be impossible.

2. The Promise of AI

Supporters of these tools highlight clear advantages:

● Immediate access, 24/7.
● Anonymity, which makes it easier to talk about taboo subjects without fear of judgment.
● Reduced costs compared to traditional sessions.
● Prevention, since algorithms can detect early risk signals in language.

Some examples already exist: Woebot Health, backed by clinical studies of its effectiveness for anxiety and depression; Wysa, approved by the NHS in the United Kingdom and integrated into some NHS trusts and talking-therapies services; and the informal use of ChatGPT as an "emotional coach" by thousands of people around the world.

AI can serve as a very accessible introduction to therapy, as long as the right prompts are used. It is important to remember that AI may sometimes try to please you, giving only the affirmations you want to hear, which does not always help. That is why you need very specific instructions that make the AI ask questions and push you to reflect, not just confirm your ideas. If you are looking for personal or professional growth, I recommend the prompts by Juan Lombana: sometimes the answer is uncomfortable, but it helps you reflect and see different perspectives. He even shares a new one every week in his newsletter.

3. Risks and Ethical Dilemmas

Of course, not everything is positive, and the risks are as big as the promises. AI does not feel empathy; it only simulates it. Who is responsible if an AI gives harmful advice? What about privacy, when such intimate conversations may be used to train models? Another concern is dependency: some people already form emotional bonds with chatbots, as in the controversial case of Replika.

Here is the real question: can an algorithm be a "therapist," or just a kind of "emotional first aid"? In my experience, AI is an excellent introductory tool if used correctly. I do not see it as a replacement but as a complement, and above all as a way to start opening up to "someone" who does not judge you. It can also help at the beginning of human therapy: sharing an AI-generated summary of your habits or thoughts can guide the therapist and accelerate the process.

4. The Middle Point: Complement, Not Substitute

The most realistic vision is not to replace psychologists, but to use AI as an initial access layer. It can be the first contact for emotional aid, a filter that refers patients to human professionals when risk signals appear, and a tool for follow-up between sessions.

The clearest analogy is telemedicine: it did not replace doctors, but it did expand access significantly.

5. The Mexico and Latin America Angle

In Mexico and Latin America, the opportunity is even greater. Few solutions exist in Spanish, and fewer are adapted to local culture. The deficit of specialists is even higher in rural areas, where AI could be an invaluable bridge.

It is also fertile ground for startups developing accessible, safe, and culturally attuned solutions. Insurance companies and employers could integrate emotional-support chatbots into their wellness programs, while governments could explore them in youth prevention initiatives.

The economic impact is clear: the WHO estimates that every dollar invested in mental health treatment returns US$4 in improved health and productivity.

6. Closing Reflection

The question is not whether AI will replace therapists, but whether we are ready to see it as an ally in such a human field. The most hopeful future is not to choose between humans or algorithms, but to accept that together they can democratize access to mental health.

I have personally experienced how liberating it can be to talk with an AI to organize thoughts, gain clarity for decisions, or simply find an immediate solution to a concrete problem. Of course it does not replace the warmth of a human, but it does open an immediate space for reflection. If I, as an entrepreneur and mother, have found it useful, how many more people could benefit from it?

References:

Medina-Mora, M.E. et al. (2003). The Mexican National Comorbidity Survey (Encuesta Nacional de Epidemiología Psiquiátrica). INPRFM/INEGI.

World Health Organization (2021). Mental Health Atlas 2020. WHO.

Pan American Health Organization (2020). Mental Health in the Americas: Panorama 2020. PAHO/WHO.

Fitzpatrick, K.K. et al. (2017). Delivering Cognitive Behavior Therapy Using a Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Mental Health.

NHS (2021). NHS Apps Library – Wysa: AI Coach for Mental Health.

The Guardian (2025). "I felt pure, unconditional love": The people who marry their AI chatbots. https://www.theguardian.com/tv-and-radio/2025/jul/12/i-felt-pure-unconditional-love-the-people-who-marry-their-ai-chatbots
