Many therapists approach AI documentation tools with caution. That caution is understandable — and, in many ways, it reflects exactly the kind of professional responsibility that good clinical practice requires. Therapy involves some of the most sensitive information a person will ever share.
This article is intended to address the most common concerns honestly and directly — not to dismiss them. The goal is to help therapists ask the right questions about any tool they're considering, understand the relevant distinctions, and make informed decisions about their practice.
Therapy involves highly sensitive, confidential information. Clients share things in the therapy room — and increasingly in telehealth sessions — that they may not share anywhere else. Therapists carry legal and ethical obligations around that information under HIPAA, state licensing standards, and the ethical codes of their professional associations. Those obligations don't disappear when a new tool enters the picture.
The caution many therapists bring to AI tools is not resistance to technology — it's an appropriate extension of existing professional values. The right questions to ask about any documentation tool are straightforward: Who sees the data? How is it stored? Is it used for model training? Does the tool require access to the session itself? How does the tool affect therapist control over the clinical record? These questions have answers, and vendors who can't or won't provide them clearly are worth scrutinizing.
HIPAA applies to any tool that handles protected health information (PHI). A vendor that processes or stores PHI on behalf of a covered entity — which includes most licensed mental health providers — is a "business associate" under HIPAA and should offer a Business Associate Agreement (BAA). The BAA establishes the vendor's obligations for safeguarding the data it handles. A vendor that handles PHI and does not offer a BAA is not operating in a HIPAA-compliant manner.
One distinction worth understanding: tools that require audio or video access to sessions involve a higher-risk category of data than tools that work from text summaries typed by the therapist after the session ends. Recording raises separate questions about consent, the storage of session content, and the scope of what the vendor can access. Text summaries — which contain only what the therapist chooses to write — present a narrower data footprint. This is not legal advice; clinicians should consult their own counsel or professional organization for guidance specific to their situation and jurisdiction.
There is an important and often overlooked distinction between two different types of AI documentation tools: those that record or transcribe sessions, and those that help therapists structure summaries they write themselves.
Tools in the first category — session recorders and transcription tools — capture the actual content of a session. This raises meaningful questions about client consent, the storage and security of session recordings, what the vendor can do with that content, and how transcription errors might affect the clinical record. These are legitimate concerns that deserve careful evaluation.
AfterSession does not record sessions. The tool works differently: after a session ends, the therapist types or speaks a summary of what happened — in their own words, describing what they observed, what interventions they used, and what the plan is going forward. The AI then structures that summary into a formatted clinical note. No session audio or video is involved at any point in the process.
This distinction matters practically. When a therapist writes a post-session summary, they are exercising clinical judgment about what to document — the same judgment they would apply to any form of note-taking. The AI receives only what the therapist chooses to provide.
A responsible AI documentation workflow keeps the clinician in control at every step. In a well-designed tool, the following should all be true:
Therapist writes the source summary — the AI receives only what the clinician chooses to provide
Therapist reviews the AI-generated draft before anything is saved
Therapist edits or rewrites any section that doesn't accurately reflect the session
Therapist's name and clinical judgment are on the final, approved note
Therapist can reject or rewrite any generated content — nothing is finalized without approval
These are the questions worth asking before adopting any AI documentation tool in your practice:
Does it require session recordings?
Tools that access audio or video from sessions carry more significant consent, privacy, and compliance considerations than tools that work from text summaries provided by the therapist after the session.
Does it offer a Business Associate Agreement (BAA)?
Any vendor that handles protected health information should be willing to sign a BAA. If a vendor doesn't offer one, that's a meaningful red flag for HIPAA purposes.
Does it use submitted content to train AI models?
Therapists should verify that clinical content they submit is not used to train the vendor's AI models. This is a reasonable and important question to ask directly.
Does the therapist review all output before saving?
The clinician should be the final gatekeeper. Any tool that saves notes automatically — without therapist review and approval — removes a critical layer of clinical control.
How is data stored and retained?
Understand where data is stored, for how long, who has access to it, and under what circumstances it might be shared or accessed by the vendor.
Is access limited to the treating clinician?
In a well-designed tool, only the clinician (and any authorized practice members) should be able to access the notes associated with their clients.
Responsible use of AI documentation tools looks a lot like responsible use of any clinical tool: understand what it does, evaluate it against your professional obligations, use it within clearly defined limits, and maintain clinical oversight of the output. AI does not make documentation decisions — the clinician does. The AI structures and formats; the therapist judges, edits, and approves.
AfterSession was built around the principle of therapist control. Sessions are never recorded. The therapist provides the clinical content through a post-session summary. The AI structures that summary into a formatted note. The therapist reviews, edits, and saves — or doesn't. Nothing becomes part of the record until the clinician approves it. That workflow reflects a deliberate choice to keep the documentation process as close as possible to what thoughtful manual documentation already looks like.
For more on how AI fits into clinical documentation, see our AI Therapy Notes Guide and Can AI Help Write Therapy Notes?
Do I need client consent to use an AI documentation tool?
This depends on the tool and your jurisdiction. Tools that access session recordings or direct client information may trigger consent requirements. Tools that work from text summaries provided by the therapist after the session — without any recording or direct client input — present a different question. Many therapists and professional organizations treat this similarly to using any other documentation software. That said, practice norms and requirements vary. Consult your licensing board, professional organization, or legal counsel for guidance specific to your practice setting and jurisdiction.
What should I ask an AI documentation vendor?
Key questions include: Do you offer a Business Associate Agreement? Do you use submitted content to train your AI models? Does the tool require session recordings? How is data stored, retained, and secured? Who has access to submitted content? Does the therapist review and approve all output before it's saved? Reputable vendors should be able to answer these questions clearly and directly.
Can AI-assisted notes be subpoenaed?
Clinical records — regardless of how they were created — can potentially be subject to legal process, including subpoenas. This is true for handwritten notes, EHR records, and AI-assisted notes alike. The key factor is not how the note was drafted, but whether it is part of the clinical record. Therapists should follow existing record retention and confidentiality practices. Specific questions about legal exposure should be directed to a licensed attorney familiar with mental health law in your jurisdiction.
Does malpractice insurance cover AI-assisted documentation?
Coverage varies by insurer and policy. Some malpractice policies have been updated to address AI-assisted clinical tools; others have not been specifically amended. The most reliable approach is to contact your malpractice insurance carrier directly and ask whether AI-assisted documentation is covered under your current policy, and whether any disclosure or documentation requirements apply. This is a straightforward question that many clinicians are now asking their carriers.
AI documentation can be used responsibly in therapy practice. The conditions for responsible use are clear: the tool should not require session recordings, the vendor should offer a BAA and not use submitted content for model training, all output should be reviewed and approved by the therapist before saving, and data should be stored securely with access appropriately limited.
Therapist caution about new tools is appropriate and worth maintaining. The right response to that caution isn't reassurance — it's transparency about how a tool works, honest answers to the questions that matter, and a design that keeps the clinician in control. The questions in this article are the right ones to ask. Any tool worth using should be able to answer them clearly.