ACA 2026 made one thing clear: AI is already reshaping counseling

Key takeaways
- The question at ACA 2026 was no longer whether AI can replace counselors – it was how clinicians should adapt to a profession that’s already changing
- Therapeutic alliance remains one of the strongest predictors of treatment outcomes, and it’s something AI cannot replicate
- Clinicians experiencing burnout are adopting AI tools faster than any other group – driven by administrative exhaustion, not tech enthusiasm
- Many practicing counselors are using non-HIPAA-compliant AI tools for clinical work without realizing the risk
- Purpose-built, HIPAA-compliant tools like TheraNest’s AI Session Assistant exist to keep clinicians in control while protecting client data
The conversation about AI and mental health used to center on a rather existential question: Can a machine replace a counselor? At ACA 2026, that question was mostly gone. What replaced it was more practical, and in some ways more interesting.
Here are five things that stood out.
1. The knowledge that matters most has changed
One of the clearest themes across sessions: AI has largely commoditized the retrieval side of clinical knowledge. A counselor who can recall DSM criteria, name every evidence-based intervention for treatment-resistant depression, or generate a structured treatment plan is doing something AI can now do faster.
But that’s a shift in emphasis, not a threat to the profession. What AI cannot do is sit with someone in genuine distress and have that presence matter. It cannot carry the weight of a therapeutic relationship built over months. It cannot apply clinical judgment shaped by thousands of hours of real sessions. Research has consistently shown that the therapeutic alliance – the quality of the relationship between clinician and client – is one of the strongest predictors of treatment outcomes across theoretical orientations and presenting problems.
AI doesn’t compete on that ground. What ACA 2026 suggested is that clinicians who recognize this shift will stop spending energy trying to out-recall an algorithm and start leaning harder into the things that make them irreplaceable: presence, attunement, judgment, and the experienced application of knowledge in relationships. If anything, AI is underscoring how important the human side of therapy really is – and, by absorbing administrative work, it frees counselors to focus on the relationship and their clinical approach.
2. Clinicians are starting to trust AI-assisted treatment plans more than unassisted ones
This finding surprised many attendees. Counselors at the conference reported that a treatment plan developed with AI assistance now carries more credibility than one developed without it – at least in complex cases. The reasoning isn’t that AI is smarter. It’s that AI-assisted plans tend to be more comprehensive, better organized, and less likely to reflect a single clinician’s blind spots.
Six months ago, this wasn’t the consensus. The shift is recent enough that it’s worth paying attention to. It also raises a question the field hasn’t fully answered yet: as AI-assisted documentation becomes the norm, what happens to clinicians who haven’t adopted it? (This finding emerged from panel discussions at ACA 2026 and hasn’t yet been studied formally – but the consistency of what clinicians reported makes it worth tracking.)
3. AI documentation is being adopted faster than almost any tool in healthcare history
Healthcare is a notoriously slow-moving technology market. Electronic health records took decades to reach broad adoption. Telehealth spent years on the margins before COVID compressed roughly a decade of growth into eighteen months – a widely cited exception.
AI documentation tools appear to be on a similar trajectory, without needing a forcing event to get there.
The reasons aren’t hard to identify. These tools integrate into existing workflows without asking clinicians to change how they practice. They address one of the most acute pain points in the profession – documentation burden – almost immediately. And critically, they leave clinicians in control. The tool assists; the clinician decides.
That combination – low friction, high relevance, preserved autonomy – may be the formula for the next wave of widely adopted clinical technology. Tools like TheraNest’s AI Session Assistant are built around exactly that principle.
4. Burned-out clinicians are turning to AI tools first
According to our Future of Therapy Report, clinicians experiencing the highest levels of burnout are adopting AI tools at the fastest rate: they are 17 percentage points more likely than those with low burnout to consider using AI for client intake.
This reframes the adoption story. AI uptake in mental health isn’t primarily driven by tech enthusiasm, but by exhaustion. Clinicians are reaching for these tools because they’re drowning in administrative work that has nothing to do with why they entered the profession.
Burnout among mental health professionals is well-documented and predates AI by decades. A 2018 systematic review and meta-analysis of 62 studies found that roughly 40% of mental health professionals experience high levels of emotional exhaustion. Our own report found that 82% of therapists experience burnout or serious fatigue, with 54% experiencing persistent or frequent burnout. What’s new is that a plausible, scalable intervention now exists, and the people who need it most are finding it first.
5. Many counselors are using AI tools that weren’t built for clinical work
This was the undercurrent running through ACA 2026 that didn’t always make it into formal sessions: A significant number of practicing clinicians are already using general-purpose AI tools like ChatGPT, Claude, or Gemini for real clinical work. Writing session notes. Drafting treatment plans. Researching interventions.
None of those tools are HIPAA-compliant by default. Entering protected health information into a general-purpose AI chatbot can violate HIPAA’s Privacy and Security Rules regardless of intent. The U.S. Department of Health and Human Services has issued specific guidance on what covered entities must do before using third-party software and cloud services with patient data.
This isn’t a fringe issue. It’s happening at scale, largely because compliant alternatives haven’t always been easy to find, affordable, or well-publicized. Clinicians using non-compliant tools aren’t being reckless. Many simply don’t know the risk they’re taking. Purpose-built, HIPAA-compliant clinical AI tools – like TheraNest’s AI Session Assistant – exist and are designed to keep clinicians in control of the final note while protecting client data. The profession needs to close that awareness gap with education rather than judgment.
The bigger picture
AI is not arriving in mental health as a distant future event. It’s here, it’s being adopted quickly, and it’s already changing what clinical competence looks like and how clinicians spend their time.
The counselors who will navigate this well aren’t the ones who resist the technology or the ones who adopt it uncritically. They’re the ones who understand what AI can and can’t do and build their practice accordingly.
Which, come to think of it, is the same judgment clinicians have always applied to every tool in the room.
A sincere thank you to everyone who visited TheraNest by Ensora Health at ACA 2026, shared their experiences, and participated in the conversation. As a strategic partner of the American Counseling Association, we had the privilege of bringing this research directly to the people it’s about, and your honesty and thoughtfulness made the conversation richer than we could have made it on our own.