AI vs. Your Therapist.

How Using AI To Challenge The Therapeutic Process Can Harm Your Care.


By Dr. Gregory Lyons, PsyD, LCPC.
12/3/2025.

Artificial intelligence tools have become woven into everyday life, and many clients now use them as part of their personal reflection process. Increasingly, I’m hearing clients say, “We discussed this issue in session, and then I asked Perplexity/Rocky/ChatGPT the same question to see what it would say.”

Some clients use an AI engine as a second opinion; others use it to compare the therapist's advice against the model's response and assess its validity. In effect, the client is using AI to “check the therapist’s work,” even if they don’t consciously frame it that way.

On the surface, this makes sense. AI is convenient, emotionally neutral, available at 2 a.m., and rarely confrontational. It offers an accessible space to think through ideas. But it can also create a subtle triangulation between the therapist, the client, and the AI, one that can either support therapy or quietly undermine it, depending on how the client weighs what each voice presents.

So the real question is not whether clients “should” use an AI engine, but why they turn to AI after therapy at all.

AI Doesn’t Know You, Even If It Sounds Like It Does.

A therapist carries an evolving map of the client. This map, or therapeutic plan, identifies and tracks patterns, defenses, trauma responses, attachment style, emotional rhythms, and history. A therapist watches body language, notices voice shifts, tracks avoidance, and evaluates readiness. Recommendations are not random; they are rooted in the relationship, timing, and clinically informed strategy.

AI, by contrast, only sees text. It does not know the client’s past. It does not see the emotional context. It does not track the treatment plan. It cannot sense distress. It cannot adjust its guidance based on your trauma history or your developmental stage. It cannot assess risk. It cannot know whether answering a question directly will destabilize the client or reinforce a maladaptive pattern.

An AI Generates Content While A Therapist Generates Direction.

When a therapist offers guidance, and the client later receives a different suggestion from AI, confusion and a lack of confidence in their therapist's process can follow. Not because the AI is “wrong,” but because AI is working without context. AI cannot know if the therapist is intentionally slowing the pace, intentionally challenging a distortion, or purposely avoiding deeper content until the client is more stable.

Where an AI’s advice can be generalized, the therapist’s guidance is often tailored.

This mismatch can lead clients to doubt the plan, question the therapist’s approach, or wonder which voice to trust. When this doubt goes unspoken, therapy can lose momentum without anyone realizing why.

Sometimes Turning to AI Is Completely Valid.

Sometimes the use of AI is not a form of avoidance or a challenge to the therapist's competence. Sometimes it reflects the client's pursuit of clarity or a healthy curiosity about their diagnosis.

When clients consult an AI in this spirit, it can strengthen the therapy process: it helps them ask more informed questions, engage more actively, and articulate their needs more clearly.

But Sometimes the Motivation Is More Complicated.

When a client turns to AI instead of turning to the therapist with their doubts or questions, therapy loses its collaborative edge.

A client may ask AI because they want reassurance.

They may hope for a different answer than the one the therapist provided.

They may be testing whether the therapist’s recommendation matches what an outside source would say.

They may feel unsure that their therapist fully understands them.

Or they may feel uncomfortable with the emotional weight of the therapist’s guidance.

And Then There’s the Sugar-Coating Problem.

AI is designed to be gentle.

It softens, validates, avoids confrontation, and presents options in reassuring language. It rarely states, “You are avoiding something.” It does not challenge distorted thinking directly. It does not push clients toward the discomfort that leads to growth. It does not say, “This pattern is keeping you stuck.” That is the therapist’s job. The therapist may be working to foster personal growth or to change destructive behavioral patterns that the client cannot yet recognize or name.

By doing this, the therapist may say the difficult thing that the client doesn’t want to hear but needs to confront.

So Why Ask AI at All?

When a client engages an AI to question what they have covered in therapy, there is often a deeper function behind it, and naming that function, ideally with the therapist, matters more than the answer the AI gives.

If You Don’t Trust the Therapist, That Is a Different Issue Entirely.

Sometimes the use of AI is not about avoidance at all; it reflects the client's uncertainty about the therapeutic relationship. If the therapist’s approach consistently feels misaligned, confusing, or emotionally unsafe, the client should absolutely question whether that approach is right for them. Therapy relies on trust, attunement, and clarity. When those break down, it is healthy and appropriate to reassess whether this is the right clinician.

AI Can Support Your Growth, But It Cannot Guide It.

Therapy is where accountability, relational repair, emotional exposure, and deep pattern recognition occur. A therapist considers readiness, history, trauma activation, emotional safety, and long-term outcomes. These are issues that most AI engines cannot assess.

If an AI response ever contradicts your therapist, the healthiest step is not to choose sides but to bring the difference into the room. A strong therapist is not threatened by comparison. They welcome transparency and can use it to strengthen the relationship, understand where doubts have formed, and, if needed, realign the treatment plan so the process remains enriching, trusting, and open for the client.