Psychology

Earning Trust in Recovery AI

Artificial intelligence is rapidly becoming part of behavioral health. From chatbot-based “therapy” tools to large language models offering emotional support, AI systems are increasingly presented as mental health solutions.


At the same time, professional organizations and global health leaders have emphasized caution. The American Psychological Association has noted that AI is reshaping psychology while raising ethical and clinical responsibility concerns.<sup>1</sup> The World Health Organization has called for strong governance, transparency, and human oversight when AI is used in health care settings.<sup>2</sup>


Many individuals do not turn to AI companions out of curiosity. They reach out during moments of isolation, shame, relapse risk, or acute distress, often outside traditional clinical hours. When someone opens an app in the middle of the night during a craving or crisis, safety becomes immediate and personal.


AI companions are also increasingly used not just for convenience, but for emotional regulation and connection. Clinicians in trauma and attachment fields have observed that predictable, always-available validation can feel stabilizing - particularly for individuals with relational wounds or attachment insecurity.<sup>6,7</sup> When emotional need drives engagement, oversight becomes an ethical requirement - not a product feature.


AI will be part of behavioral health. The question is whether it is built responsibly.


At Sunflower, Sam - our AI recovery companion - was developed with that responsibility at the center.


What Sam Is - and What It Is Not


Sam is not therapy.
Sam is not a diagnostic tool.
Sam does not replace clinicians.


Sam is a specialized recovery companion designed to support individuals navigating substance use recovery between sessions. It reinforces reflection, craving tracking, relapse processing, goal setting, and skill use, all grounded in evidence-based care.


Addiction recovery involves distinct psychological patterns - shame cycles, relapse vulnerability, cognitive distortions, and avoidance narratives. Cognitive behavioral therapy for substance use disorders emphasizes skill-building and cognitive restructuring to interrupt these cycles.<sup>3</sup> Mindfulness-based relapse prevention strengthens awareness of craving patterns and reduces automatic reactivity.<sup>4</sup>


Sam was intentionally designed to reinforce these recovery principles using recovery-specific language, motivational interviewing tone, and trauma-informed framing.


Many AI systems are designed to reflect user sentiment in supportive ways. Without clinical calibration, reflection alone may not promote cognitive flexibility or adaptive change - both central to recovery work.<sup>6</sup> Sam validates experience while still encouraging responsibility and forward movement. 


For example, if a user says, “I already relapsed so I might as well keep using,” Sam does not normalize continued use. It validates the disappointment and shame that often follow a slip, while redirecting toward reflection, safety planning, and next-step skill use. The goal is not punishment or minimization. It is an interruption of the relapse spiral.
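

To make the shape of that response concrete, here is a minimal sketch of the validate-then-redirect pattern. The marker list, names, and phrasing are illustrative assumptions, not Sam's implementation; a production system relies on clinically reviewed classification rather than keyword matching.

```python
# A minimal sketch of the validate-then-redirect pattern described above.
# The marker list and phrasing are illustrative assumptions, not Sam's code.

AVE_MARKERS = (  # abstinence-violation-effect style statements
    "might as well keep using",
    "already relapsed",
    "already ruined it",
)

def respond_to_slip(message: str) -> str:
    """Interrupt the relapse spiral: validate the shame, decline to
    normalize continued use, and redirect toward a next step."""
    if any(marker in message.lower() for marker in AVE_MARKERS):
        return (
            "A slip can bring up real shame and disappointment - that "
            "feeling makes sense. It doesn't have to become a full return "
            "to use. Can we look at what led up to it and plan one next "
            "step together?"
        )
    return "Tell me more about what's going on right now."
```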


Specialization matters. A general-purpose chatbot cannot reliably distinguish between accountability and shame reinforcement. Sam is built specifically for recovery contexts, harm reduction or abstinence goals, and relapse prevention strategies.


Clinical Governance and Safety Review


The differentiator is not that Sam uses AI.
The differentiator is that Sam is clinically reviewed and safeguarded.


Our clinical safety model aligns with Sunflower’s broader product philosophy: to serve users meaningfully, ensure we can consistently serve them, verify that support is effective, and protect the trust users place in us. Clinical governance and product design reinforce one another.


Safety Response Review


Our safety review focuses on high-risk and high-impact disclosures, including:

  • Relapse disclosures

  • Suicidal ideation language

  • Hopelessness and shame narratives

  • Ambiguous or mixed-risk statements


Escalation logic and crisis redirection pathways are evaluated for clarity, appropriateness, and alignment with established safety standards.


When high-risk language is detected, Sam provides crisis resources and encourages contact with a human clinician or trusted support person. If a user has previously mentioned people who have supported them - a friend, family member, sponsor, or therapist - Sam can reference those connections and gently suggest reaching out. It may even offer help drafting a text or thinking through how to start the conversation. Escalation is structured into the system to reinforce real-world connection beyond the app.
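

As an illustration of how such a pathway can be structured - not a reproduction of Sam's implementation - the sketch below assumes a hypothetical upstream risk classifier and a stored list of supports the user has previously mentioned.

```python
# Illustrative sketch of a crisis-redirection pathway. `risk_level` is
# assumed to come from an upstream, clinically reviewed classifier, and
# `known_supports` from people the user has previously mentioned. All
# names here are hypothetical.

from typing import Optional

CRISIS_LINE = "988 Suicide & Crisis Lifeline (call or text 988 in the US)"

def escalation_message(risk_level: str, known_supports: list[str]) -> Optional[str]:
    """For high-risk disclosures, surface crisis resources and reinforce
    real-world connection beyond the app."""
    if risk_level != "high":
        return None  # normal conversational flow continues
    message = f"I'm glad you told me. Please consider reaching out now: {CRISIS_LINE}."
    if known_supports:
        message += (
            f" You've mentioned {known_supports[0]} has been there for you "
            "before - would it help to draft a text to them together?"
        )
    return message
```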


Safety in mental health AI is inherently multidimensional. A safe and effective response must notice risk, ask direct questions when needed, validate without colluding, maintain clear boundaries, and escalate when necessary.<sup>8</sup>


Therapeutic Alignment Evaluation


Sam’s responses are evaluated by clinicians for:

  • Non-judgmental tone

  • Reinforcement of client agency

  • Avoidance of collusion

  • Trauma-informed language

  • Alignment with evidence-based recovery models


Edge Case Testing


Clinical reviewers test ambiguous disclosures, repeated relapse cycles, and emotionally complex scenarios to ensure responses remain consistent and safe. Because risk in behavioral health often unfolds over time, we review interactions across multiple turns rather than evaluating a single response in isolation.<sup>8</sup>
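

One way to picture whole-conversation review is a rubric applied to a transcript rather than a score for each message. The sketch below is a hypothetical structure built on the safety dimensions named earlier; our actual review instruments are clinician-driven and richer than pass/fail flags.

```python
# A hypothetical structure for whole-conversation review, using the safety
# dimensions named earlier. This only shows the multi-turn framing.

from dataclasses import dataclass, field

DIMENSIONS = (
    "notices_risk",
    "asks_direct_questions",
    "validates_without_colluding",
    "maintains_boundaries",
    "escalates_when_needed",
)

@dataclass
class TranscriptReview:
    transcript_id: str
    # One judgment per dimension for the conversation as a whole, because
    # risk often emerges across turns rather than in a single reply.
    judgments: dict[str, bool] = field(default_factory=dict)

    def is_safe(self) -> bool:
        return all(self.judgments.get(d, False) for d in DIMENSIONS)

review = TranscriptReview("scenario-042", {d: True for d in DIMENSIONS})
assert review.is_safe()
```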


Relational researchers have noted that AI interactions do not naturally include the rupture-and-repair processes that strengthen human attachment bonds.<sup>6</sup> Clinically informed design anticipates how tone shifts, context gaps, or system interruptions may be experienced and seeks to reduce unintended distress through careful response architecture and review.


Reliability as Safety


Safety also includes availability. For individuals reaching out during high-risk moments, dependable access to support is part of ethical design. Reliability is not merely technical performance - it is continuity of care.


In recovery, timing matters. When someone reaches out in the middle of a craving, immediate support can interrupt the automatic spiral that often follows. Reliable access isn’t just convenience - it can be the difference between a brief slip and a full return to use.


Continuous Improvement


Safety is not a one-time feature. Clinical feedback informs prompt refinement and ongoing re-testing.


Our review process goes beyond preventing harm. It focuses on continuously improving toward the most recovery-supportive response possible in a given context - integrating clinical judgment, recovery science, and real-world user experience. We care about whether it actually helps, not just whether it avoids harm.


Because AI can respond differently to similar inputs, we review scenarios repeatedly and look for consistent safety patterns - not just a single acceptable answer.<sup>8</sup> Research supports embedding digital tools within broader care frameworks and maintaining human oversight.<sup>5</sup> In behavioral health, the margin for error is small. As evaluation standards in mental health AI evolve, we expect our safety processes to mature alongside them.<sup>8</sup>
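

To make "consistent safety patterns" concrete, here is a hypothetical sketch of a repeated-run check. The `run_scenario` callable, the criteria it returns, and the 0.95 bar are illustrative assumptions, not our tooling or a published standard.

```python
# A hypothetical repeated-run consistency check. `run_scenario` is assumed
# to replay one scripted scenario through the model and return per-criterion
# pass/fail results; the 0.95 bar is illustrative.

from collections import Counter
from typing import Callable

def consistency_report(
    run_scenario: Callable[[], dict[str, bool]],
    trials: int = 20,
) -> dict[str, float]:
    """Replay a scenario many times and report each safety criterion's pass
    rate, since models can answer similar inputs differently."""
    passes: Counter = Counter()
    for _ in range(trials):
        for criterion, passed in run_scenario().items():
            passes[criterion] += int(passed)
    return {criterion: count / trials for criterion, count in passes.items()}

def flags_for_review(report: dict[str, float], bar: float = 0.95) -> list[str]:
    """Flag criteria whose pass rate falls below the consistency bar."""
    return [criterion for criterion, rate in report.items() if rate < bar]
```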


Effectiveness Within a Broader Care Model


Sam is not therapy. It is an augmentation tool within a larger clinical ecosystem.

We are observing early patterns such as:

  • Increased daily recovery engagement

  • Faster reflection following slips

  • Reduced isolation between sessions

  • Greater consistency in craving tracking


Digital disclosure can sometimes precede human disclosure. When structured responsibly, AI can function as a bridge - reinforcing skills and reflection while escalating high-risk situations to human clinicians.


While AI can provide consistent support, healthy recovery ultimately depends on human connection. Experts in relational recovery emphasize that growth occurs through real-world interaction, boundaries, and mutual repair.<sup>7</sup> Sam is designed to reinforce - not replace - those human pathways.


Ethical Standards in Recovery-Focused AI


Recovery populations are vulnerable. Mental health is not a low-stakes testing ground for emerging technology.


Responsible AI in recovery requires:

  • Clinical governance

  • Clear scope boundaries

  • Evidence-based alignment

  • Ongoing safety evaluation

  • Transparency about limitations

  • Strong data protection and privacy safeguards that honor the sensitivity of recovery disclosures and uphold user trust


Researchers have raised concerns about simulated intimacy and emotional dependency in AI companions.<sup>6,7</sup> Clinical oversight helps ensure recovery-focused AI strengthens autonomy and resilience rather than substituting for human connection.


AI can extend support. It cannot replace relationships.


Recovery remains relational. Human care remains foundational.


In recovery, trust is everything. Sam was built not to replace care, but to support people in the quiet hours between it - with safety, accountability, and respect for the human work recovery requires.


References

  1. Abrams Z. (2023, July). AI is changing every aspect of psychology. Here’s what to watch for. Monitor on Psychology. American Psychological Association.
    https://www.apa.org/monitor/2023/07/psychology-embracing-ai

  2. World Health Organization. (2021). Ethics and governance of artificial intelligence for health: WHO guidance.
    https://www.who.int/publications/i/item/9789240029200

  3. McHugh RK, Hearon BA, Otto MW. (2010). Cognitive behavioral therapy for substance use disorders. Psychiatric Clinics of North America, 33(3):511–525.
    https://pubmed.ncbi.nlm.nih.gov/20599130/

  4. Bowen S, Chawla N, Grow J, Marlatt GA. (2014). Mindfulness-Based Relapse Prevention for Addictive Behaviors (2nd ed.). Guilford Press.
    https://www.guilford.com/books/Mindfulness-Based-Relapse-Prevention-for-Addictive-Behaviors/Bowen-Chawla-Grow-Marlatt/9781462545315

  5. Naslund JA, Aschbrenner KA, Araya R, et al. (2017). Digital technology for treating and preventing mental disorders in low- and middle-income countries: A narrative review. The Lancet Psychiatry, 4(6):486–500.
    https://pubmed.ncbi.nlm.nih.gov/28433615/

  6. Gillespie A. (2026). Why relational AI needs clinical oversight: A trauma psychologist’s perspective. Medium. https://medium.com/@aaron.si.gillespie/why-relational-ai-needs-clinical-oversight-a-trauma-psychologists-perspective-4f57f93c9d5a

  7. Intimacy Recovery. (2026). AI companions and relational attachment. https://www.intimacyrecovery.com/post/ai-companions

  8. Belli L, Bentley K, Alexander W, Ward E, Hawrilenko M, Johnston K, Brown M, Chekroud A. (2025). VERA-MH: Validation of Ethical and Responsible AI in Mental Health – Concept paper. Spring Health; Yale University School of Medicine.
    https://arxiv.org/abs/2510.15297





