Beyond Therapy: How a student-built app is using AI to update addiction recovery for Gen Z

MPH student and CEO Ishan Shah and his co-founder built the Peerakeet platform after realizing the transformative power of peer connection, creating a safe, moderated “digital front door to recovery” for Gen Z.

Last April, two Brown University students began building the prototype of an AI-driven platform designed to help young people struggling with addiction. Next week, their fledgling company, Peerakeet, will become one of 15 companies to join Character Labs, a highly selective startup accelerator that supports founders building new AI applications. 

Peerakeet was founded by Ishan Shah, a Brown MPH student and the company’s chief executive officer, and Hunter Adrian ’25, its chief technical officer. Together, they’ve created a peer-support addiction and mental health tool that is accessible, private and engaging for their Gen Z peers. The app enables daily check-ins, anonymous matching and moderated peer conversations guided by emotionally intelligent AI. The team has launched pilots at two universities so far and is collaborating with campus wellness programs nationwide.

We spoke with Shah about Peerakeet’s founding, the role of AI in recovery and his plans for the company’s future. 

What led you to create Peerakeet?

I’ve always been interested in addiction; it’s been a major force in my life. It affected my childhood, and later on, during college, I developed some habits of my own. During my recovery, I met an anonymous peer online through a social network that wasn’t designed for addiction support, but that person ended up helping me a lot—alongside medical treatment and therapy. It wasn’t a perfect experience online, but it showed me the power of connection, even between strangers. Just knowing someone else had gone through what I was experiencing was transformative for me and changed how I viewed recovery.

That moment stuck with me. When I came to Brown for my MPH, I kept thinking about how I could combine what I was learning about addiction with that experience: how to create a platform that could replicate the kind of support I received, but built intentionally with healthcare, research and safety in mind.

So I brought the idea to my co-founder, Hunter, a computer science concentrator. He was a captain on the wrestling team at Brown, and I worked as the team manager. One day on a team bus, I said something like, “Dude, just look around, our whole generation is struggling, and no one’s really talking about it.” He immediately connected with that, and from there we started building. The project has grown a lot since that first idea. It’s become something really complex and meaningful, which is exciting.

How does AI moderate and guide peer conversations?

A core part of our safety model is an emotionally intelligent moderation system. It’s designed to guide the real human conversations happening on the platform—not replace them, which is something you see a lot right now, especially in public health, where there’s a trend toward bots replacing people.

Our hypothesis was that AI can never fully replicate what a human feels, especially when it comes to something as nuanced as addiction. So, we built an AI that can flag moments of crisis—like signs of self-harm or illegal activity—and respond in real time using compassionate, human-centered language. It redirects the conversation and connects people to safety resources.

If someone is in danger, it will automatically display emergency contacts or local support options. The schools and communities we partner with determine what those resources are, so it’s very collaborative—we’re integrating with existing support systems. The idea is that users can have authentic peer-to-peer conversations without a human moderator reading their messages, but with the reassurance that there’s still safety and accountability built in.


You’re piloting the platform right now at different universities. What have you learned about the mental health of Generation Z?

I think from our early conversations with schools, it’s become really clear that Gen Z students want the kind of support we’re building—but not necessarily through traditional counseling centers or programs like Alcoholics Anonymous. They’re looking for anonymity, relatability and immediacy.

Speaking as someone from that generation, my instinct is always to turn to technology to help solve problems in my life. Our goal is to take the elements that make real-life support effective, and translate them into a digital environment where students already feel comfortable engaging.

Our campus partners have been instrumental in shaping the platform as we pilot it. We take their feedback seriously, and some early takeaways have been that schools want something that’s low-lift for their staff—since many recovery or wellness offices are understaffed or underfunded—and something that’s extremely safe for students. Those two priorities—ease of implementation and safety—have really guided us from the start.


How do students actually use the app?

Every feature has been shaped by student feedback and testing sessions. The idea is that students can check in daily, journal privately and then connect with peers who have shared experiences for supportive, moderated conversations.

Eventually, the goal is to build in group chats and educational modules as well. Right now, we’re focused on proving that the core flow—reflecting or checking in, journaling, then matching and talking—is meaningful on its own. But there are definitely additional features coming down the line.

It’s been exciting, though, because students have been really validating about it. They see this as something they’ve been looking for. Everywhere I go, people are eager to adopt it, which has been amazing to see.

When will the app be fully launched?

Our plan is for the start of next year. Right now, we’re only piloting in controlled spaces so we can closely monitor what’s happening. But eventually, we’ll open it up to students or young people who aren’t part of a partner college or program.

They’ll be able to pay individually if they want, but our ultimate goal is to make the platform covered through schools, employers, clinics or insurance—so that access doesn’t depend on someone’s ability to pay. That’s a huge issue with current solutions: the financial burden often falls on individuals, who, speaking from experience, are already struggling with stability.


What does success look like for Peerakeet?

I think success, for me, is a little abstract. Ultimately, if Peerakeet becomes what we call the “digital front door to recovery” for our generation—and hopefully for future generations too—that would mean we’re reaching people who might not otherwise ask for help, and that we’re helping them feel better over time.

We’ll measure that through anonymized cohort data like check-ins, mood improvements, reduced cravings, the number of messages sent, or crisis escalations. But honestly, the qualitative feedback matters just as much. If students tell us they feel supported, seen, and connected—that’s the real indicator of success. Because recovery and mental health are so deeply emotional and personal, that human response is just as valuable as any dataset.