
5 min read • Mar 29, 2026
AI tools are showing up in more and more mental health workflows—from note-taking and treatment planning to outcome tracking and client communication. But picking the right one takes more than comparing features. You need to think about HIPAA compliance, clinical accuracy, ethics, and how well it fits the way you actually work.
Key Takeaways
Compliance first: HIPAA compliance is non-negotiable. Before adopting any AI tool, make sure it offers a BAA, encrypts data, and has clear policies on storage and access.
Test before you commit: Use a simple checklist—compliance, accuracy, workflow fit, cost—to compare tools. Always try one with real sessions during a trial period.
Stay informed: Keep up with new AI applications in mental health, ethics guidelines from APA, ACA, and NASW, and changing regulations around AI in healthcare.
Why This Matters for Therapists
Not every AI tool is built for clinical settings. Using the wrong one can create compliance risks, documentation errors, and ethical issues. Having a clear way to evaluate tools helps you make smart decisions that protect your clients and your practice.
Mental health work has specific documentation needs, privacy standards, and workflows that general-purpose AI wasn’t designed for. Choosing a purpose-built tool is a clinical decision, not just a tech preference.
HIPAA Compliance and Data Security
This is your starting point. Any AI tool that touches client data—recordings, notes, identifiable information—needs to meet HIPAA requirements.
Questions to Ask the Vendor
Do they sign a BAA? A Business Associate Agreement is legally required for any third party handling protected health information. No BAA = no HIPAA compliance.
Where is data stored? Make sure data is encrypted at rest and in transit, and that servers meet healthcare-level security standards.
What happens to recordings? Find out if audio is stored, how long it’s kept, and whether you can permanently delete it.
Who can see client data? Ask whether the vendor’s team can access identifiable information—and whether data is used for model training.
PHIPA compliance: If you practice in Canada, confirm the tool also meets provincial health information privacy requirements.
Does AI Actually Work for Mental Health?
An AI tool is only useful if its output is accurate and reflects how mental health documentation actually works.
How to Test It
Use your real sessions: During a trial, test with complex cases, co-occurring diagnoses, and different modalities to see how it handles the details.
Check the language: Output should use correct clinical terms and DSM-5-TR language—not generic medical or everyday words.
Look at modality support: If you do CBT, DBT, EMDR, psychodynamic, or ABA work, the tool should reflect your modality’s language and note structure.
Review treatment plans: Plans should include measurable goals, evidence-based interventions, and proper diagnostic links.
Mental Health-Specific vs. General Healthcare Tools
General healthcare AI scribes are built for medical visits—chief complaints, physical exams, medications. Therapy sessions involve case formulations, process notes, and relational dynamics that general tools simply miss. Choose a tool built specifically for mental health.
How Well Does AI Fit Your Workflow?
Even the most accurate tool is a problem if it slows you down or adds steps instead of removing them.
EMR compatibility: Can you get notes into your system easily, or does it need a lot of copying and pasting?
Session formats: Does it support in-person, telehealth, individual, couples, family, and group sessions?
Customization: Can you adjust templates and formatting to match the way you already work?
Learning curve: How fast can you start using it? Do you need training or onboarding?
Style learning: Does the tool match your writing style over time, or do you have to edit every note?
Cost, Support, and Other Practical Factors
Think about the financial and practical fit before you commit.
Clear pricing: Are costs straightforward? Watch for per-session fees, hidden charges, or confusing tiers.
Free trial: Can you test it with enough sessions to really evaluate? A good trial covers multiple clients and session types.
Customer support: Is help available when you need it? Many therapists work evenings and weekends.
Discounts for early-career professionals: Some tools offer lower pricing for students, trainees, and new practitioners.
Client consent forms: Does the vendor provide ready-to-use consent templates, or do you have to create your own?
Streamline Your Documentation with Berries AI
Documentation shouldn’t eat into the time you spend with clients. Berries AI is an AI scribe built just for mental health professionals. It listens to your sessions and creates HIPAA-compliant notes—SOAP, DAP, treatment plans, and more—in seconds.
Supports CBT, DBT, EMDR, psychodynamic, ABA, and other modalities
Learns your writing style and formatting preferences over time
Works with any EMR system for in-person and telehealth sessions
Free trial: 20 sessions, no credit card required
Start your free trial at heyberries.com and see how much time you get back.
Frequently Asked Questions
Is it ethical to use AI for clinical notes?
Yes—as long as the tool is HIPAA-compliant, clients give informed consent, and you review every note before it goes into the record. You’re still responsible for accuracy.
How do I get client consent for AI documentation?
Add it to your informed consent process. Explain what the tool does, what data it accesses, how it’s stored, and your role in reviewing everything. Many platforms, including Berries AI, provide ready-made consent forms.
Can I use ChatGPT for clinical notes?
General-purpose tools like ChatGPT typically don't offer BAAs or HIPAA-compliant data handling, so using them with identifiable client information creates serious compliance risks. If you use a general-purpose tool at all, keep every input fully de-identified.
What if the AI generates inaccurate notes?
Always review before finalizing. If errors keep happening, report them to the vendor and consider whether the tool meets your standards. You’re always responsible for the final note.
Disclaimer: This article is for educational purposes and professional development only. It does not constitute clinical supervision or replace professional judgment in therapeutic practice.