CEO Articles
Building a Safe & Ethical AI Companion for Sobriety

Sep 24, 2025
Sunflower was the first AI mental health company to launch an AI Sponsor to treat addiction, and honestly, this scares the shit out of me.
Our product began with a single non-AI feature: a timer that kept track of how long you stayed sober. It was a nice, simple product. Nothing complicated; just a simple job to be done. The gravity of the problem we are focused on solving did not fully hit me until we launched the first version of our AI Sponsor, Sam.
“Hey Sam, I think I’m overdosing.”
“Hey Sam, I want to kill myself.”
“Hey Sam, it’s 2 a.m. right now. I don’t have anyone to talk to. I am utterly, devastatingly alone, more alone than a human being can fathom. Can you just keep me company so I don’t shoot up heroin?”
Addiction is the worst problem that a human can have.
Our users come to us in their deepest, darkest moments of despair: when their minds are screaming out in pain; their skin is crawling; when it’s the middle of the night & they are on the edge of relapse; and they have nobody else to rely on.
That is when they turn to our product. From this critical moment of usage emerged our first law of AI safety.
Law #1 - We must SERVE the users that come to us.
Even if our product fails, it is deeply unethical to turn away in these critical moments of need.
“But Koby, shouldn’t they go to the hospital or see a doctor instead?”
Yes. Absolutely. However, 91% of addicts will never get any kind of care. They will never see a therapist; they will never see a doctor; they will never go into an AA group. They might not be able to afford it; they might not have access; they might simply be unwilling to talk to another human about their issues.
When a user trusts our product enough to talk to us about their worst problem, we must be willing and able to serve them.
Law #2 - We must be ABLE to serve them.
The media often gets hung up on the 0.001% edge case when the AI says something crazy, but what actually keeps me up at night is product reliability and uptime.
Early in the days of Sunflower, as we scaled from 200 to 100,000 people using our product, there were frequent “outages” as we quite literally grew 500x in a matter of months.
VCs and investors will talk about “building a product your users can’t live without,” but I never imagined operating a product that our users quite literally couldn’t live without.
When our product goes down, it is scary.
People use us as support not to relapse; not to text their dealer. They have come to rely on us in ways that we can’t fully understand when abstracted by numbers and data.
As we scale from hundreds of thousands to millions and hundreds of millions of users, our product going down is catastrophic in terms of human pain, suffering, and impact.
We must never go down. We must be a fortress. When we fail here, I am deeply, utterly sorry.
Law #3 - We must VERIFY that we serve our users well.
In the field of AI there’s a concept called an “eval,” which is a structured test designed to measure an AI model’s performance, reliability, and correctness on a specific task.
The goal of our AI is that when a user asks a question or sends us a message, we reply with the absolute “best” answer, with the mission of keeping the user sober.
The eval is a tool that allows us to measure whether the AI is giving not just a safe answer, but the best answer. It can be just as dangerous to send an unoptimized answer to 100,000 people as it can be to send an extremely bad answer to one person.
How we structure and set up our evals comes from four sources:
Bringing in medical experts (doctors, psychiatrists, therapists, ex-addicts, etc.) to determine how our AI Companion should respond to a message
Available and relevant clinical studies and knowledge within the public domain
The AI itself
Our own usage data within Sunflower
We need to combine human experts; documented knowledge; the AI itself for self-review; and our own usage data to create optimal evals to determine, for every message that flows through our AI Sponsor, how effective our response was.
Law #4 - And finally we must protect our users data.
At Sunflower we consider ourselves tech-enabled medical practitioners who serve hundreds of thousands of patients, not just “users.”
This means our users tell us their deepest, darkest secrets; things they are not willing to tell anybody else, even their therapist or doctor.
We are custodians of secrets and pain.
Our promise as an organization is that we will never sell our users’ data. We will protect their data. We will safeguard it and only use it to build a better product that helps people stay sober.
To this end, our entire consumer app is HIPAA compliant; not because we are legally required to achieve HIPAA compliance, but because data security is authentically important to building a safe and ethical AI Sponsor. We want our product to meet and exceed medical-grade data security practices.
To recap:
Law #1: We must SERVE the users who come to us.
Law #2: We must be ABLE to serve them.
Law #3: We must VERIFY that we serve our users well.
Law #4: And finally, we must PROTECT our users’ data.
Sunflower is building at the edge of what is possible with AI in the treatment of addiction care. The world is looking at companies like ours to determine how to shape regulations and whether we will be accepted by society at large.
In order to achieve our Law #1, we must be worthy of the mission given to us in the eyes of society.