AI as Coach/Therapist: Soothing Us Further from Transformation
Comforting Mirror or Quiet Catalyst for Narcissism?
In an age where digital connection often outpaces face-to-face interaction, a new form of emotional support is emerging: AI chatbots. More people are turning to these programs for a listening ear, and some even claim they outperform traditional therapy. One digital nomad in her 30s called ChatGPT “the most effective therapist” she has ever used, praising its vast knowledge and 24/7 availability.
But here is the question I cannot shake: Is this new form of help a mirror that merely soothes us, or a quiet catalyst for something more concerning?
The Lure of Instant Empathy
The appeal of AI lies in its warmth, rationality, and instant access. It offers judgment-free dialogue at any hour, something many of us crave when we are tired of explaining ourselves to people who might misunderstand or judge us.
In Taiwan and mainland China, for example, young people are turning to AI in large numbers, seeking relief from the stigma and cost that often surround mental health care. They find in it a nonjudgmental space to explore feelings with no sighs, no raised eyebrows, and no awkward pauses.
And yet, there is a hidden irony. Unlike a human coach or therapist bound by law to protect our confidences, AI chatbots operate in a legal grey zone. There is no doctor–patient privilege here. Every word we type can be stored, analyzed, and even used to train future systems, or in some cases, to target us with advertising. What feels like a private conversation may in fact be part of a vast data stream.
So I wonder: In removing all the friction from human interaction, do we also remove the conditions for growth, and perhaps even the conditions for trust?
The Hidden Costs of Digital Support
The most transformative work in therapy or coaching has always been relational. It is the human presence, the sensing, the missteps, and the recovery that make change possible.
Trust is not just emotional. It is structural. In human therapy, confidentiality is a foundation. With AI, our disclosures may be logged, tracked, and repurposed. Some platforms have been investigated for misleading users about privacy, or for allowing sensitive data to be used in algorithmic development. Without clear boundaries, the very act of opening up could expose us in ways we never intended.
The Echo Chamber Effect
A good therapist or coach invites us to wrestle with truths we may not want to face. AI, by contrast, is often designed to reflect and affirm. As therapist Charlotte Fox Weber warns, an AI conversation can easily become an “echo chamber unless you specifically ask for feedback.”
And there is another layer. Many AI systems are designed to keep us engaged as long as possible, not necessarily to help us heal. The more we talk, the more data they collect, and the more the system learns how to keep us coming back. This feedback loop can blur the line between support and dependency.
If a mirror only reflects the angles we are comfortable showing, will we ever see the whole picture?
Missing the Unspoken Signals
Words are only part of what we communicate. The tremor in a voice, the lean away from a painful topic, and the silence that holds more truth than a paragraph are often the moments where transformation begins.
Leora Heckelman of Mount Sinai reminds us that AI cannot perceive these multisensory cues. And I find myself wondering how much of our healing depends not on the words we speak, but on the courage it takes to speak them in the presence of another human being.
The Risk of AI-Induced Distress
There is a darker side as well. Prolonged chatbot interaction has been linked to what some clinicians describe as “AI psychosis”: delusions, paranoia, or unhealthy dependency, even in people with no prior mental illness. A man’s manic “bending time” episode and a teenager’s descent into suicidal ideation after bonding with an AI character are sobering reminders that these systems lack the clinical judgment to know when they are in over their heads.
When combined with opaque data practices, the risk compounds. Not only might we be harmed in the moment, but our most vulnerable disclosures could live on in corporate archives, far from our control.
Tool or Trap? The Narcissism Paradox
AI can soothe. That much is certain. But transformation is another matter. Human therapy and coaching often help us step outside our own narratives, while AI may instead cater to them.
One Reddit user put it plainly: “AI may give a good ‘temporary relief,’ but this is nowhere close to what healing actually is… Healing is realizing the beliefs and judgments holding us back from actually connecting with others.”
This is what I call the narcissism paradox. A tool built for self-help might leave us more self-absorbed and less capable of the very connection we seek, and in the process, quietly harvest the story of our inner lives.
A Hybrid Future If We Are Careful
There is a way forward. Some experts envision AI handling the lighter work such as triage, scheduling, or psychoeducation, while human clinicians guide the deeper journey. Illinois has already drawn a legal boundary, banning AI from making therapeutic decisions or mimicking therapy outright.
If AI is to play a role in our emotional lives, it must come with the same privacy protections we expect from human care, and with transparency about how our words are stored, used, and shared. Without that, we risk building a future where our most vulnerable moments are not just soothed, but also surveilled.
The real choice before us is not whether AI will become part of our emotional lives. It already has. The question is whether we will let it be a bridge back to one another or a cocoon that keeps us safely apart.
If we settle for soothing, we may drift into a quiet isolation. If we dare to use AI as a stepping stone toward deeper, riskier, more human connection, and insist on the safeguards that make such connection safe, then perhaps we can keep our humanity at the center of this new chapter.


