A therapist’s guide to evaluating AI for your practice

Artificial intelligence (AI) is becoming a part of many professions, and mental healthcare is one of them. You might be hearing more about AI tools designed for therapists and wondering if they can really help. Can they reduce your administrative workload? Are they safe for your clients? How can you tell a good tool from a bad one?

According to our Future of Therapy report, 40% of therapists are already using AI for at least one administrative task. That number is expected to grow by 59% by 2030. AI tools are becoming more affordable and effective, but with that accessibility comes a wide range of options, and they’re not all equal. This guide will help you understand how to use AI for tasks like scheduling, note-taking, and client intake while keeping your practice and clients safe. We’ll give you a clear, simple framework for evaluating and using these tools wisely.

The ground rules: Privacy, laws, and ethics

Before you even consider an AI tool, it’s important to understand the rules. Your clients’ privacy and trust are your top priorities, and using any new technology requires careful attention to legal and ethical standards.

HIPAA is non-negotiable

The Health Insurance Portability and Accountability Act (HIPAA) is the foundation of client privacy in the U.S. When you use any software that handles Protected Health Information (PHI), it must be HIPAA-compliant.

This means the AI vendor must be willing to sign a Business Associate Agreement (BAA) with you. A BAA is a legal contract that requires the vendor to protect your clients’ information according to HIPAA rules. If a vendor won’t sign a BAA, you cannot use their tool. It’s that simple. As of 2025, HIPAA requirements have become even stricter, so make sure your vendor meets the latest security standards for encryption and access controls. If your tool stores data in the cloud, find out where that data lives. For therapists with international clients, confirm compliance with global privacy laws like the GDPR. It’s a little like telehealth in that way: there are many video conferencing apps to choose from, but you need to pick one that is HIPAA-compliant.

According to our Future of Therapy survey, 79% of respondents cited client data privacy as their top concern with AI. This highlights the growing need for transparency and robust security measures when adopting AI tools in healthcare and therapy settings. Ensuring AI tools align with strict privacy regulations like HIPAA helps protect client trust and prevents potential legal complications. When exploring AI vendors, place client data privacy at the top of your evaluation checklist.

State laws are getting stricter

On top of federal rules, many states are creating their own laws for AI in healthcare. States like California, Illinois, Nevada, and Utah now have specific rules about using AI, especially for tools like chatbots. These laws often require you to tell clients when they are interacting with AI and ensure a human therapist is always in charge of clinical decisions. As a therapist, you are responsible for knowing and following the laws in every state where you are licensed to practice.

Your ethical obligations

Even if a tool is legal to use, you still need to think about your ethical responsibilities. AI algorithms can sometimes have biases, meaning they might not work as well for people from different backgrounds. You are still the clinician, and you are ultimately responsible for the care your clients receive. Never let an AI tool make a clinical decision for you. It’s a tool to assist you, not replace your professional judgment.

Ask vendors how their AI works. Is it rule-based or generative? Can they explain how outputs are generated? Transparency builds trust and helps you answer client questions confidently.

AI implementation in four admin tasks

Let’s explore how AI can help with specific administrative tasks that often contribute to burnout. We picked these tasks because therapists told us they use AI for them the most in our Future of Therapy report, and because admin tasks are some of the biggest contributors to therapist burnout. Our report also found that the more burned out a therapist is, the more likely they are to turn to AI tools for help.

1. Scheduling and reminders

How much time do you or your staff spend on the phone scheduling appointments and making reminder calls? AI-powered scheduling assistants can handle this for you 24/7. These tools can let clients book appointments online, send automatic reminders via text or email, and even manage cancellations. About 30% of the therapists we surveyed use AI for scheduling and reminders.

Best practices:

  • Start by mapping out your current booking process to see where the bottlenecks are.
  • Choose a tool that integrates with your EHR and calendar. Some EHRs already send automatic SMS reminders, so make sure clients don’t receive duplicates.
  • Make sure your client consent forms include permission to send text reminders.
  • Pilot the tool with a small group of new clients to track its impact on your no-show rate.

2. Taking notes

Documentation is one of the biggest administrative burdens for therapists; 23% of the therapists we surveyed use AI for writing session notes. Many AI tools can polish rough drafts into clear, concise notes, capturing important client details, interventions, and progress while filtering out unnecessary information. These tools can also spot typos or inconsistencies, helping you maintain accurate, professional records.

Meanwhile, AI scribes can help by listening to your session audio (always with client consent) and creating a draft of your progress note. They allow therapists to be more present during sessions, and to spend less time on notes after work.

While these tools claim to save a considerable amount of documentation time, they are not perfect. They can make mistakes, so you must always review and edit every note to ensure it is accurate. You are the one signing off on the note, and you are legally responsible for its content.

Best practices:

  • Always get written consent from your client before recording a session.
  • Be transparent. Let your client know you are using an AI tool to help with notes and show them where the microphone is.
  • Regularly check the AI-generated notes for accuracy to catch any patterns of errors.

3. Client intake

The intake process often involves a lot of paperwork and back-and-forth communication. AI can streamline this workflow. For example, an AI tool can guide a new client through filling out demographic information, signing consent forms, and even completing initial screening questionnaires like the PHQ-9.

If a screening tool shows a client is at high risk, the system can automatically flag them for an urgent follow-up call. This not only saves time but also improves client safety. Some tools can also help assess client fit, so you can make sure you’re equipped to help them.

As of 2025, 21% of therapists we surveyed use AI for client intake management.

Best practices:

  • Choose an AI platform that integrates with your EHR and allows you to customize intake forms.
  • Include questions in the intake process about risk factors, and set up automatic alerts for responses that signal urgent needs.
  • Regularly review intake data for accuracy and completeness, correcting errors promptly.
  • Ensure all electronic consent and screening forms meet HIPAA and state privacy requirements.
  • Test the system before rolling it out to catch any workflow issues.
  • Provide clear instructions and support for clients who may not be comfortable with digital forms.

4. Assessment and screening

AI can also help you track client progress over time. Some tools automatically send out validated assessments (like the GAD-7) to clients and then score them and display the results in a dashboard. This makes it easier to use measurement-based care in your practice.

However, use these tools with caution. Never use an AI score as a replacement for your clinical judgment. Think of it as another piece of data to help you and your client understand their progress.

Best practices:

  • Use AI-powered screening as a supplement to, not a replacement for, standard assessment tools. 
  • Always maintain the ability to override the AI’s suggestion. 
  • Document in the client’s chart that you used an AI-assisted score and confirmed the finding with your own clinical assessment. 

An 11-point checklist for choosing the right AI tools

With so many options out there, it can be hard to know where to start. Use this checklist to evaluate any AI tool you’re considering for your practice.

  1. Define its purpose: Be clear about what you want the tool to do. Is it for an administrative task like scheduling, or a clinical task like screening? This distinction matters for legal reasons.
  2. Confirm HIPAA compliance: Ask the vendor for a signed BAA that meets the latest security standards.
  3. Check security: Ask if the company has a SOC 2 Type II report. This is an independent audit that verifies they have strong security practices.
  4. Verify liability coverage: Some malpractice insurers now ask about AI use. Confirm that your coverage includes AI-assisted documentation or screening to avoid gaps in protection.
  5. Protect client data: Get a written guarantee that your clients’ data will not be used to train the company’s AI models and that it can be permanently deleted.
  6. Demand audit logs: The tool must keep a detailed log of who accesses information and when. This is a HIPAA requirement.
  7. Ask about fairness: Ask the vendor whether they have tested their tool for biases related to race, gender, or culture.
  8. Look for proof: For tools that help with screening or assessment, ask for studies showing they are accurate and effective.
  9. Ensure it fits your workflow: Does it work with your current EHR, calendar, and other software? If not, it might create more work than it saves.
  10. Understand the full cost: The price isn’t just the monthly subscription. Ask about extra fees, such as usage tokens or integration costs.
  11. Have a plan: Before you roll it out, plan for training your staff, getting client consent, and what to do if the tool stops working.

One of the easiest ways to adopt compliant AI tools is to use AI that’s already built into a reliable EHR. These tools are often tailored for medical use, compliant with privacy laws, and fit into your workflow without workarounds. Of course, you should still do your due diligence and make sure the AI providers your EHR partners with are aligned with your practice’s needs.

Your 90-day plan to get started

Feeling ready to explore AI? Here’s a simple, 90-day sprint to help you get started.

  1. Weeks 1-2: Plan. Identify your biggest administrative headaches. Talk with your team if you have one, and draft the consent forms you’ll need.
  2. Weeks 3-6: Pilot. Choose one tool to start with, like a scheduler. Roll it out to new clients only. Have one therapist try an AI scribe and track its accuracy.
  3. Weeks 7-10: Review. Look at the data. Has your no-show rate improved? How much time are you saving on notes?
  4. Weeks 11-12: Decide. If you’ve seen a clear benefit and had no security issues, you can plan a full rollout. If not, you can adjust your approach or try a different tool.

When deployed properly, AI can help therapists be more present, save time, and alleviate their administrative burden. It’s a powerful tool, but with great power comes great responsibility. Ultimately, like any tech solution, you want to be sure it actually reduces your busywork. Evaluating AI carefully before implementing it in your practice helps you use it responsibly and efficiently.