Therapists navigate disclosure dilemma as clients turn to AI for mental health advice

Therapists are increasingly incorporating discussions about artificial intelligence into their client consultations, addressing both their own AI usage in practice and their clients’ independent use of AI for mental health guidance. This shift reflects the growing intersection of AI and mental health care, where practitioners must navigate disclosure obligations while helping clients understand the benefits and risks of AI-driven therapy tools.

What you should know: Mental health professionals face two distinct AI-related disclosure considerations that are reshaping therapeutic relationships.

  • Therapists must decide whether to inform clients about their own use of AI tools within their practice, particularly for therapeutic rather than administrative purposes.
  • They should also address clients’ independent use of popular AI platforms like ChatGPT for mental health advice, given that some portion of ChatGPT’s 400 million weekly active users likely seek therapy-like guidance.

The disclosure debate: Professional opinions remain divided on whether AI usage disclosure should be mandatory or voluntary for therapists.

  • Proponents argue that voluntary disclosure demonstrates cutting-edge practice and can attract tech-savvy clients who view AI integration as an advantage.
  • Critics worry that unprompted AI discussions might unnecessarily alarm clients or create confusion about the therapist’s role versus AI’s capabilities.
  • The disclosure approach may vary by client demographic, with younger, digitally comfortable clients often more receptive than those less familiar with technology.

Drawing from medical precedent: Healthcare organizations are developing frameworks that mental health practitioners can adapt for their own AI disclosure policies.

  • A recent JAMA research paper suggests patient notifications should cover six key elements: the fact that AI is being used, its functions, how it works, why it improves care, performance monitoring, and patient choices.
  • Medical disclosure frameworks emphasize using “plain and concrete language” and focus on material information that affects patient care.
  • Legal considerations include state laws that may specify required disclosures about AI tool usage.

Sample therapeutic disclosure: Therapists are experimenting with transparency statements that explain their AI integration approach.

  • Example language includes: “As part of my commitment to providing effective, ethical, and up-to-date mental health care, I want to be transparent about how I use artificial intelligence (AI), including large language models (LLMs) and generative AI, in my practice.”
  • Specific AI applications might include organizing therapy notes, generating educational materials for clients, developing worksheets and mindfulness scripts, and handling administrative tasks.
  • Disclosures typically emphasize that “any use of AI tools does not replace my clinical judgment, and I do not use AI to make diagnoses.”

The client AI usage challenge: Many clients are already using AI for mental health guidance without their therapist’s knowledge, creating potential therapeutic conflicts.

  • Clients may receive contradictory advice from AI platforms that conflicts with their therapist’s recommendations, leading to a “tug of war” between human and artificial guidance.
  • Some therapists remain unaware of AI’s growing role in mental health care, creating blind spots in their practice when clients ask about AI-driven tools.
  • The lack of AI literacy among some practitioners means they cannot adequately address client questions about AI’s role in mental health care.

Why this matters: The widespread adoption of AI in mental health creates an unavoidable reality that therapists must address proactively.

  • Therapists who remain uninformed about AI risk appearing outdated when clients ask natural questions about AI’s role in mental health.
  • The integration challenge extends beyond disclosure to fundamental questions about maintaining therapeutic relationships in an AI-augmented environment.
  • As one expert noted, the field is moving “step-by-step away from the classic dyad of therapist-client” and into “a world of the therapist-AI-client triad.”
Therapists’ Newest Duty Is To Talk With Their Clients About The Use Of AI For Mental Health
