AI in Therapy: Why Your Counselling Session Should Not Be Recorded by Artificial Intelligence
- May 4

It is rare for me to feel like the old man yelling at the clouds, so please bear with me.
Yes, I drive a car that is 26 years older than I am. Yes, I appreciate things that have character, are crafted, analogue, and don't always make logical sense. But I do not think of myself as anti-technology. I try to stay well-read on current research, especially regarding mental health, clinical practice, and the tools that shape my work.
One modern development I have struggled to understand is how aggressively AI tools are being marketed to therapists... and how quickly many therapists are welcoming them into the therapy room! The pitch sounds reasonable at first: less paperwork, less burnout, faster notes, and more time to focus on the client. These are not an inherently bad set of goals. Therapists do carry a lot of administrative weight. Notes, forms, emails, billing, risk documentation, and treatment planning all take time. I understand why a tool promising to reduce that burden would be appealing.
But therapy is not an ordinary meeting.
It is not a podcast, a business call, or a lecture being recorded for convenience. It is a confidential space where people often say things they have barely admitted to themselves, let alone another person. When a tool is invited into that space to listen, transcribe, summarize, or generate a clinical record, I think therapists need to ask a more basic question:
"How does this serve the client?"
That is the question therapists have to ask themselves even with something as simple as self-disclosure. If I mention my aforementioned silly old car (a 1965 Mustang, if you were curious), parenting, grief, faith, chronic pain, or anything from my own life, it is not because I feel like talking about myself. It has to serve the client, and it has to support the work. Otherwise, it does not belong in the room.
So when AI scribes are marketed as an anti-burnout solution for therapists, I think we need to slow down. Since when do we make clinical decisions primarily based on what is easiest for us?
Clinical notes can be tedious, but they are not meaningless admin. They are carefully written clinical records. They are supposed to capture what is relevant, protect confidentiality, and avoid creating unnecessary harm for the client later. Good notes require judgment, restraint and an understanding that not everything said in therapy belongs in the chart.
This post is not written to be alarmist. It is written for clients, therapists, and anyone interested in counselling who wants to better understand what may be changing inside therapy rooms. Many of these tools rely on recording, transcribing, or ambient listening before producing a note. Even when these tools are marketed as secure, efficient, or therapist-friendly, clients deserve to understand what is happening with their private information.
AI may have useful roles in some parts of daily life, administrative work, or health care. But when it comes to recording, transcribing, summarizing, or generating notes from counselling sessions, I believe we need to be much more careful.
Clients come to therapy to be heard, not to be listened to and harvested for data by technology.

What Are AI Therapy Notes?
AI therapy note tools are software products designed to support or automate clinical documentation. Depending on the tool, they may listen to the session, record audio, transcribe what is said, summarize the conversation, identify clinical themes, suggest treatment goals, or generate a draft progress note for the therapist to review.
Some tools claim not to store audio. Some claim not to use client data to train AI systems. Some are marketed directly to mental health professionals with privacy assurances built into the sales pitch. The details matter, and not every tool works the same way. But for a client, the practical question is much simpler:
"Who, or what, is in the room with me?"
That question matters because therapy is not just the exchange of information. It is not simply content moving from one person to another. Therapy depends on safety, trust, privacy, and the ability to speak honestly. A counselling session is often where people talk about grief, shame, trauma, anger, sexuality, faith, relationship pain, substance use, suicidal thoughts, family history, and the private architecture of a life.
For many clients, these are not polished thoughts. They are half-formed, emotional, hesitant, contradictory, and deeply human. That is part of the work. Therapy gives people room to say things imperfectly before they understand them clearly.
So when AI is added to that process, the concern is not only technical. It is relational.
You may not know whether your therapist uses these tools unless you ask directly.
The Privacy Concern
For clients, AI in therapy is not just a technical issue. It is a trust issue.
Therapy information is not data in any ordinary sense. It can include trauma histories, relationship breakdowns, substance use, suicidal thoughts, workplace conflict, sexuality, religious struggle, grief, family pain, shame, and things a person may not want written anywhere in detail. It is not the same as meeting notes from a staff meeting. It is not the same as a transcript from a sales call.
Therapy information is someone’s private life.

When an AI tool is introduced into that process, reasonable questions follow.
Where is the information stored?
Is it processed in Canada or outside Canada?
Is the audio saved after transcription?
Are transcripts retained?
Can the data be used to improve or train AI systems?
Who has access to it?
What happens if the company is sold, changes ownership, changes its privacy policy, or experiences a breach?
Can the information be deleted?
These are not paranoid questions. They are basic privacy questions.
In BC, privacy and professional guidance increasingly points toward transparency, informed consent, client choice, privacy protection, and clinician accountability when AI tools are used in health care or counselling contexts. While I am thankful for that guidance, it alone is not sufficient. I believe the ethical responsibility goes further than compliance requirements.
The deeper question is not only whether the tool is technically secure. The deeper question is whether the use of that tool changes the therapy room itself.
A Recorded Room Is Not the Same Room
This is the part I think about most, because it seems to be discussed the least.
Even if an AI tool passes every privacy test, a room where a session is being recorded, transcribed, or listened to by software carries a different quality than one where it is not. Clients may hold back. They may avoid the thing they actually came to say. They may wonder, consciously or not, where their words are going, who might read them, how they might be summarized, or how they might be used later.
And while I am using the word “may” carefully here, this is not only theoretical.
I remember this clearly from my practicum days, when recording sessions for supervision was part of the training process. Even with informed consent, even with a clear clinical purpose, and even with clients who wanted to support my learning, the room changed. Clients acted differently, and truthfully I acted differently. The sessions were not as deep; clients were clearly more guarded. After the recorded sessions, many confessed that they had filtered themselves more than during usual sessions. One client even said something I have never forgotten:
“I wanted to go easy on you because you have already helped so much.”
That moment has stayed with me because it showed how relational and deeply human therapy is. The client was not only thinking about themselves. They were thinking about me, the recording, my supervision, and how their words might affect the process. While it was a kind act, it also changed the work and lessened what the client got out of the session.
So when we talk about recording, transcription, or ambient AI listening in therapy, we should not pretend this is a neutral addition to the room. It may be quiet, it may be invisible, and it may even be well-intentioned. But it still has the capacity to change the dynamic and efficacy of therapy.
A recorded therapy room is not the same as an unrecorded therapy room.
If a client is filtering themselves because of a recording tool, the work becomes less honest. If the work becomes less honest, it becomes less useful, regardless of how skilled or well-meaning the therapist is.
This is something I think about often in my work with men in particular. For many men, getting to the point of saying something real in a therapy room already takes considerable effort. The language for internal experience often comes slowly, if it comes at all. Adding a recording layer to that space, even a technically secure one, can quietly raise the threshold for what feels safe to say. The thing that was almost said becomes the thing that stays unsaid.
That may sound subtle, but subtle things matter in therapy. Often, the work is happening right at the edge of what a person can tolerate saying out loud. Privacy is not just a nice feature of therapy. It is part of what makes the work possible.
Consent Is More Complicated Than a Signature
In therapy, consent is not the same as clicking “accept” on an app.
There is a power dynamic in the room from the first session. Clients often do not feel entirely free to push back on something that seems like standard practice, especially before trust has been established. A client may agree to AI note-taking because they assume it is normal, expected, or required. They may not want to seem difficult. They may worry that saying no could affect the care they receive. They may not fully understand what is happening with their information.
A signed form does not always mean a client felt free to say no.
This is one of my bigger concerns. Consent can exist on paper while the client still feels pressured in practice.
If a therapist says, “I use this tool so I can be more present with you,” that may sound reassuring. But it can also make refusal feel awkward. A client may wonder, “If I say no, am I making the therapist’s job harder? Will they be less present? Will I seem paranoid or difficult?” In a first session especially, many clients are still trying to understand what therapy is supposed to feel like. Clients may not know what is normal, what is optional, or what they are allowed to question.
That is not a simple consent issue. That is a clinical issue.
I have unfortunately had several clients come to me this year after learning that a previous therapist had started using AI for notes. I am keeping the details general here to protect confidentiality, but the pattern is important to share. These were not clients reacting to some abstract technology debate; they felt that something private in the therapy relationship had changed. Therapists they had trusted, and work that had been effective, no longer felt safe or useful, purely because AI charting had fractured the therapeutic alliance.
In some cases, paperwork had been signed. In one case, AI had even been brought up directly. And still, the client felt unsettled, hurt, or betrayed after realizing what it meant in practice.
I share this because the question is not only, “Did the client technically consent?” The deeper question is, “Did the client feel free enough, informed enough, and safe enough to say no?”
There is a difference between a client agreeing because they genuinely want AI involved and a client agreeing because they think this is just how therapy works now. There is a difference between a client understanding the privacy implications and a client signing a form because they are anxious, overwhelmed, or trying to be agreeable. There is a difference between consent that protects the therapist administratively and consent that protects the client clinically.
If AI note-taking makes things easier for the therapist but makes the client less safe, less open, or less honest, then it is not a clinical improvement. It may simply be an administrative improvement at the client’s expense.
For some clients, the injury is not only about privacy. It is relational. They thought they were speaking only to their therapist. Later, they realized a third-party system had been invited into the room, and the room no longer felt the same.
Clinical Notes Can Have Real Consequences
Clinical notes are not neutral administrative documents. Let me say this again for emphasis: clinical notes are not neutral administrative documents.
In certain contexts, they can be requested or reviewed in insurance, disability, legal, employment, or medical proceedings. This is part of why therapists are trained to document carefully, with clinical judgment about what is relevant, what is contextual, and what could cause harm if taken out of context.
Good clinical notes require discernment. They are not supposed to be a full transcript of everything said. They are not supposed to include every vulnerable sentence, every passing comment, every moment of grief, anger, fear, confusion, or shame. They are supposed to document what is clinically necessary.
AI-generated notes carry specific risks in this area. They may include too much detail, flatten nuance, misrepresent risk, or turn a passing moment in a session into a permanent clinical statement.
The difference between “sometimes I wish I could disappear” and a note implying active suicidal planning is not a small one. The difference between a client honestly exploring their relationship with alcohol and a record that reads as diagnostic is not a small one either. The difference between a client venting about anger and a note that makes them sound dangerous is not a small one.
The more automated the record becomes, the more carefully we need to think about what it actually captures.
A therapy session is full of context. A person may say something hesitantly. They may say it sarcastically. They may say it through tears. They may say it as a passing thought and then immediately clarify what they mean. A therapist uses clinical judgment to understand what belongs in the record and what does not.
Accuracy Matters in Mental Health
I don't know if you have had this experience, dear reader, but AI can sound confident even when it is wrong.
I had an experience requesting research for a blog I was writing for my office, and the AI program gave me a false statistic and cited an organization that does not exist. When I challenged it, this is what I received:

That is part of what makes it tricky. AI-generated writing can look polished, organized, and clinically convincing, even when it has made up its own points.
I have seen physicians online describe AI scribe errors that would be almost funny if the context were not so serious. One example involved an AI note reportedly referencing a successful hysterectomy for a 49-year-old male patient. That kind of error is obvious enough to catch. But in therapy, the more concerning errors may be subtle.
A passing comment can become a clinical theme. A moment of dark humour can be misinterpreted as a risk without an effective safety assessment. A sentence spoken in grief can be recorded as a stable belief. A complex story can be reduced to a category that does not quite fit.
Even when the therapist reviews the AI-generated note, the AI version can begin to shape how the session is remembered. Once a summary exists, it can quietly become how the client is seen and the work that is done from that point forward.
AI does not just capture the clinical story. It can subtly rewrite it.
That does not mean therapists are careless. It means the tool can influence the record before the therapist even begins editing. Especially if the tool is marketed as a quick way to complete documentation and reduce burnout.
Questions You Can Ask Any Therapist Before Starting Counselling
If you are beginning counselling, you are allowed to ask how your information is handled. An ethical therapist should be able to answer clearly and without hesitation. You are not being difficult by asking these questions. You are protecting your privacy.
Here are questions worth asking:
About recording and notes:
Do you use AI during sessions?
Do you record or transcribe counselling sessions?
Do you use AI to write clinical notes?
Do you enter any client information into any AI systems or tools?
About data and storage:
Where is my information stored?
Is audio saved after a session?
Are transcripts retained?
Is my information used to train AI systems?
Who has access to my information?
What happens if there is a data breach?
Can my information be deleted?
About consent and care:
Can I opt out of AI tools?
Will opting out affect my care in any way?
What do you include in clinical notes?
How do you protect client confidentiality?
For more questions worth asking before your first session, this post has a practical list that goes beyond the AI topic: 10 Practical Questions to Ask a Counsellor Before You Start.
Clients Come to Therapy to Be Heard, Not Harvested
Therapy should not become another data extraction point, application, or email list to sign up for.
Clients should not have to choose between getting help and protecting their privacy.
They should not have to wonder whether their most vulnerable words are being processed by a company they have never heard of.
They should not have to filter themselves in the one space that was supposed to be designed for honesty.
Clients come to therapy to be heard, not harvested.
This does not mean technology has no place in health care. It means that therapy deserves a higher standard of care, because the material is deeply personal and the relationship itself is part of what makes the work possible.
The therapeutic relationship is not just the container for the work. In many ways, it is the work. If the client no longer feels safe with their therapist and able to speak freely, is it even therapy anymore?
If you are unsure whether counselling is the right fit for you at all, this post may help: How to Know If You Need Therapy (Even If Things Don’t Seem ‘That Bad’).
My Policy at Evan Vukets Counselling
I want to be clear about this, because I think clients deserve to know exactly where they stand before they begin counselling.
At Evan Vukets Counselling, I do not use AI to record counselling sessions, transcribe counselling sessions, generate clinical notes, summarize session content, process or analyze client information, email or message clients, or store client information.
No client information, session content, clinical notes, or identifying details are submitted into AI tools.
Your counselling session is not recorded by AI. Your words are not sent to an AI system. Your private information is not used to train or improve AI tools.
Clinical notes are written by me using professional judgment, clinical training, and careful documentation. Electronic files are secured in software compliant with the Personal Information Protection and Electronic Documents Act (PIPEDA), or, if requested, notes are handwritten and stored in a double-locked location. The goal is to capture what is clinically relevant while protecting confidentiality as fully as possible.
I work with clients in Abbotsford, the Fraser Valley, and online across BC, and this policy applies regardless of how or where we meet.
If you are in the process of choosing a counsellor and want to know what else to look for, this post walks through credentials and questions worth considering: How to Choose a Counsellor Near Me in Abbotsford and British Columbia.
FAQ
Do therapists use AI to write notes?
Some therapists are beginning to use AI tools to record, transcribe, summarize, or draft clinical notes. Not every therapist uses these tools, and not every tool works the same way. Clients can ask directly whether AI is used in sessions or documentation.
Can I ask my therapist not to use AI?
Yes. You can ask whether AI is used and whether you can opt out. A therapist should be able to explain how your information is handled and whether declining AI use would affect your care.
Does Evan Vukets Counselling use AI in sessions?
No. Evan Vukets Counselling does not use AI to record sessions, transcribe sessions, generate clinical notes, summarize session content, communicate with clients, or process client information.
Why does AI in therapy raise privacy concerns?
Therapy often includes sensitive information about trauma, relationships, substance use, suicidal thoughts, grief, sexuality, work stress, and family history. When AI tools are used, clients should understand where their information goes, who has access to it, whether it is stored, and whether it may be used to train AI systems.
Technology Should Serve Therapy, Not Reshape It
AI may have a genuinely useful role in some parts of health care and administration. But therapy requires a different level of care, because what happens in that room is private, relational, and often irreplaceable in terms of what it asks of the person sitting there.
"Convenience" for the therapist should not come at the cost of safety for the client.
A counselling session should be a place where you can speak freely, without wondering whether another system is listening, summarizing, storing, or interpreting what you say.
If you are wondering whether now might be the right time to start, this post may be a useful place to begin: When Is It Time to Start Counselling?
At Evan Vukets Counselling, your story stays in the therapy room, not inside an AI tool.
Clients come to therapy to be heard, not harvested.
If you are looking for counselling in Abbotsford, the Fraser Valley, or online across BC, Evan Vukets Counselling offers a confidential space where your story is treated with the care it deserves. A free consultation is available if you want to get a sense of whether it feels like the right fit.


