AI therapy tools built on published science. Open frameworks. Peer review.
Most AI mental health tools ask you to trust their marketing. We publish our research, open-source our testing frameworks, and invite scrutiny. Chomi is the clinical application of work we've already put in front of the scientific community.
A defensible decision for your board. A tool your clinicians can trust.
We publish. We open-source. We contribute to the field.
Our research demonstrates 97.7% consistency in AI personality emergence across frontier models. This isn't a marketing claim - it's a finding we've put in front of the scientific community for scrutiny.
Our open-source testing methodology for AI personality systems. Named adversarial systems (Rachel, Pris, Roy, Leon) for red-teaming, drift detection, and brand safety. Available on GitHub for independent verification.
Our research is conducted in partnership with Professor Adrian North and doctoral-level psychology researchers. Genuine academic rigour, not advisory-board window-dressing.
Defensible decisions in a field full of marketing claims
When your board asks "why this vendor?" - you can point to published research, open-source code, and academic partnerships. Not pitch decks. Not testimonials. Science.
AI in mental health faces increasing scrutiny. Our transparent methodology - published findings, open frameworks, named testing systems - gives regulators something to evaluate.
Your therapists will ask hard questions about any AI tool. We've already answered those questions in public, in writing, with data. They can read our work before they use our product.
Most AI mental health companies say "trust us" and point to proprietary datasets you can't verify. We take the opposite approach: publish the methodology, open-source the framework, invite criticism. Our competitors can't match this because they haven't done the work.
Jungian depth psychology as a clinical tool
Not another generic wellness chatbot. Jungian depth psychology - archetypes, shadow work, individuation - with established clinical applications. Your clinicians will recognise the methodology.
Chomi extends therapeutic work into the gaps between appointments. Patients arrive at sessions with insights already surfaced, themes already emerging. Less warm-up. More depth. Better outcomes.
Patients access Chomi on web, WhatsApp, or Telegram. No download friction. No IT deployment headaches. Fund access for your entire patient population, or let therapists allocate to individuals.
Costs flow down. Insights flow up.
Institution: Aggregated insights · Billing control · Network oversight
Therapists: Patient allocation · Session prep
Patients: Between-session work
You sponsor access for your network. Therapists allocate funding to patients - or you allocate directly. Patients engage between sessions. Therapists see what patients choose to share. You see aggregated patterns. Privacy preserved at every level.
Population-level patterns. Individual privacy preserved.
Understand capacity utilisation, engagement trends, and therapeutic reach across your entire network - without compromising the confidentiality that makes therapy work. No individual patient data exposed. Ever.
Companies in this space have raised nine figures and still hit regulatory walls. The pattern: rules-based systems that feel dated next to generative AI, and generative AI products that can't get regulatory clearance. Neither path has proved viable.
AI mental health tools become medical devices when they diagnose conditions, recommend treatments, or substitute for clinical judgment. This triggers FDA oversight in the US and MHRA classification in the UK. The distinction isn't about what you discuss - it's about what authority you claim. A tool that helps someone reflect on their week is wellness. A tool that tells them they have depression is medicine.
The FDA requires clearance for tools that treat, diagnose, or manage specific medical conditions. Claims like "reduces symptoms of anxiety" or "treats depression" trigger oversight. Claims like "general wellness," "coaching," "stress reduction," and "mindfulness" do not. Tools that support clinician-supervised care without making independent treatment claims remain outside FDA jurisdiction.
The UK published updated guidance in February 2025. A digital mental health tool becomes a medical device if it diagnoses, prevents, monitors or treats a mental health condition - or if it acts as a substitute for a healthcare professional's judgment. That last criterion is the key line. General wellbeing tools, products that support but don't replace clinical judgment, and educational tools for healthcare professionals remain outside classification.
Chomi is a Jungian companion for wellbeing. It supports therapeutic work between sessions - surfacing themes, encouraging reflection, maintaining continuity. It never diagnoses. Never prescribes. Never claims clinical authority. Your clinicians remain the clinicians. This isn't positioning to avoid regulation - it's what we actually built. The architecture reflects the boundaries.
Patient conversations live on their device until they choose to share with a therapist. Shared data resides on Microsoft Azure, enterprise tier - UK/EU data stays in UK/EU. No third-party processors touch patient conversations. We don't train on your data. When a patient requests deletion, we mean deletion: not archived, not retained for analytics, not kept for future use. Gone.
Every login logged. Every share recorded. Every permission change timestamped. When a therapist leaves your institution, their access ends immediately - patient relationships stay with you, no data walks out the door. Full audit trails exportable on demand. When regulators ask questions, you'll have answers ready before they finish asking.
Let's discuss how Chomi can extend your therapeutic reach - with published research your board can defend.