Why Therapists Should Think Twice Before Using AI for Clinical Documentation
The rise of AI tools like ChatGPT and other note-writing platforms has sparked interest and debate across many industries—including mental health. For therapists, the idea of streamlining clinical documentation sounds incredibly appealing. We all know how difficult it is to stay on top of progress notes and treatment plans.
But with convenience comes risk.
In this post, we’ll explore the hidden dangers of using AI platforms for clinical notes, progress documentation, and client-related writing. As therapists, we carry a legal and ethical responsibility to protect our clients’ information and ensure the integrity of their care. Let’s unpack why using AI may not be the best solution, and what you need to consider before integrating it into your practice.
The Risks of AI in Therapy Documentation
Client Confidentiality and HIPAA Compliance
The most glaring issue with using AI to draft therapy notes is the potential violation of client confidentiality. Most AI platforms are not HIPAA-compliant, which means any client data input into them could be at risk.
Even if you anonymize identifying details, there's still concern about how AI systems store, process, and learn from your input. Many platforms retain and use input data for model training unless you explicitly opt out, and even then the boundaries are murky.
Tip: Before entering anything, ask yourself: would I want this text to end up on a server I can't trace or control?
Loss of Clinical Nuance
AI is trained on patterns—it doesn’t truly understand human complexity or therapeutic nuance. It’s a robot. It can’t grasp the subtleties of non-verbal communication, transference, or a client’s unique relational dynamics. It can’t feel emotions or understand subtext.
While it may help with generic summaries, relying on AI for clinical notes can flatten or misrepresent a session. Therapists risk producing documentation that is overly formulaic or lacks the rich context needed for continuity of care, legal protection, and insurance reimbursement.
Ethical Considerations and Informed Consent
Using AI for documentation—especially when involving any client-specific information—raises ethical red flags. Do clients know their session details might be run through an AI system? Have they consented?
The American Counseling Association (ACA), other professional organizations, and state licensing boards have not yet offered firm guidelines on this issue, leaving therapists in gray territory. Until regulations catch up, it's best to err on the side of caution and transparency.
Potential for Dependence and Deskilling
There’s also a psychological risk for therapists themselves: over-reliance on AI can lead to clinical deskilling. If AI tools start replacing our reflective writing process, we may lose touch with an essential part of clinical practice—integrating and synthesizing what we learn about our clients from session to session.
Documentation isn’t just paperwork; it’s a part of therapeutic thinking. Automating it entirely may save time, but it could diminish our clinical insight over time and dull our clinical sense.
Legal Liability and Documentation Errors
Therapists are legally responsible for the accuracy of their notes. If AI generates misleading or incorrect information—and that documentation is later subpoenaed in court—you’re still liable.
AI systems also make mistakes. They can make up facts, misuse terminology, or omit critical details. These errors could be costly in court, audits, or professional reviews.
What Can Therapists Use AI For—Safely?
While caution is warranted, AI isn’t entirely off-limits. Let’s be real, there can be some advantages to using AI, and I’ve used some of the approaches listed below myself. Here are a few lower-risk ways to incorporate AI into your practice:
Brainstorming psychoeducation handouts
Drafting general treatment goals (not client-specific)
Marketing content
Exploring business development ideas
Summarizing articles (as long as they're not behind paywalls)
Always keep sensitive client material out of AI platforms unless you are certain they are HIPAA-compliant and have the appropriate Business Associate Agreement (BAA) in place.
Final Thoughts: Don’t Let AI Compromise Clinical Integrity
Technology is evolving fast, and it’s tempting to lean into the ease of automation. But as therapists, we must remain vigilant in upholding the standards of our field and protecting our clients. AI can be a helpful tool in the right context, but when it comes to clinical documentation, the stakes are too high to cut corners.
Protect your clients. Protect your license. Use discernment.
Have you tried using AI in your workflow? Tell me your thoughts in the comments below.