Some of your clients may already use AI. They’re asking it about their anxiety, seeking relationship advice, and processing difficult emotions through conversations with artificial intelligence. Instead of viewing this as a threat to your therapeutic relationship, you can learn to work with this reality in ways that benefit your clients’ growth and healing.
Understanding why clients turn to AI reveals important insights about their needs and preferences. Some clients find it easier to explore sensitive topics with tools like ChatGPT before bringing them to session. Others use it to practice difficult conversations or work through thoughts between appointments. And while people often lie to their therapists (more than 70% of clients do, according to Psychology.com), many feel comfortable being fully honest with a machine. The 24/7 availability appeals to clients who struggle with crisis moments outside office hours. AI also behaves however you want it to; unlike a person, if you don’t like how it speaks to you, you can simply tell it to change. And of course, you can’t beat free. Recognizing these patterns helps you understand what your clients value and where gaps might exist in their support system.
Some therapists incorporate their clients’ AI usage into their therapeutic work by making it a topic of open discussion. Ask clients directly about their experiences with AI. What questions do they ask it? How do they feel after these conversations? What insights emerge from their AI interactions? This information becomes valuable clinical material that deepens your understanding of how clients process information and seek support.
Some clients may benefit from structured approaches to AI usage. For example, they might use it as an alternative to traditional therapy homework like journaling. You might suggest they bring AI conversations to a session for review and discussion. Help them develop critical thinking skills about the responses they receive. Guide them in recognizing when AI advice aligns with therapeutic goals and when it might conflict with their treatment plan. This collaborative approach positions you as a partner in their healing rather than a competitor to their AI interactions.
Boundary setting becomes crucial when clients rely heavily on AI for emotional support. Help clients understand the difference between AI responses and human therapeutic relationships. While AI can provide information and perspective, it cannot offer the attunement, empathy, and relational healing that occurs in therapy. Because these tools are designed to please their users, they often tell people what they want to hear rather than what would actually benefit them.
In a piece on Medium, psychotherapist trainee Stephanie Priestley discusses how the therapeutic relationship is unique and can become unhelpful if dynamics shift too much, like becoming overly friendly. This blurring of lines is magnified with chatbots, as boundaries can easily fade when the technology is centered around keeping the user engaged. Maintaining the strength of the therapeutic relationship helps clients understand why using AI isn’t a replacement for therapy.
It’s also helpful to address potential risks associated with AI use without making clients feel ashamed. Some clients may receive advice that contradicts their treatment goals or mental health needs. Others might develop an unhealthy dependence on AI validation. Watch for signs that AI usage interferes with real-world relationships or replaces human connection entirely. These observations become important clinical considerations in your treatment planning.
You can also model healthy AI usage by demonstrating how to ask better questions and evaluate responses critically. Show clients how to provide context that leads to more helpful AI interactions. Teach them to recognize when they need human support instead of artificial intelligence. This educational approach builds their capacity for self-advocacy and informed decision-making.
Consider how ChatGPT usage reflects your clients’ attachment patterns and relationship dynamics. Clients who prefer AI interactions might struggle with vulnerability in human relationships. Those who seek constant AI reassurance may have underlying anxiety about their own judgment. These patterns provide rich material for therapeutic exploration and intervention.
Your therapeutic relationship remains irreplaceable despite AI advances. You offer presence, attunement, and the healing power of being truly seen and understood by another human being. AI cannot replace the safety of the space you create in therapy or the potential for transformation that comes from relational repair. Embrace your unique role while remaining curious about how AI tools might complement your work.
Moving forward, stay informed about AI developments that affect your clients’ lives. Maintain your clinical judgment about when AI usage supports therapeutic goals and when it creates obstacles. Trust your expertise in human psychology and relationships while remaining open to how technology shapes modern mental health experiences. Your clients need your guidance now more than ever as they navigate this new landscape of artificial and human support systems.