Violetta

A Spanish-language, supervised-ML WhatsApp chatbot that helps users name and understand relationship violence, get tailored guidance, and connect anonymously to human support (including the “Purple Line”) and legal resources in Mexico.
Project Description
Violetta is a WhatsApp-based assistant built to lower the barriers many people face when seeking help for gender-based violence (GBV). Rather than requiring an account or an in-person intake, it lets users message the bot in Spanish and receive structured, plain-language guidance on recognizing abuse, safety planning, and options for next steps. The system is intentionally designed as a confidential "digital confidant," meeting users where they already communicate and reducing the social and emotional costs of the first disclosure.
Technically, Violetta uses a supervised, non-generative machine-learning approach rather than an open-ended LLM. It classifies a user's concern against a curated dataset and responds with vetted, jurisdiction-appropriate information. This design choice favors consistency and predictability over improvisation, which matters in a safety-critical domain like GBV support, while keeping the conversation in accessible, everyday Spanish. The bot has supported approximately 260,000 anonymous users in Mexico over the last year, signaling strong demand for low-friction, private channels.
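The sketch below illustrates the general shape of a supervised, non-generative pipeline of this kind: a small text classifier maps an incoming message to a predefined concern category, and the bot replies with a pre-written, reviewed response for that category. The library choice (scikit-learn), the categories, the training phrases, and the response text are all illustrative assumptions for this sketch, not Violetta's actual models, taxonomy, or data.

```python
# Minimal sketch of a supervised intent classifier paired with vetted responses.
# All categories, example messages, and reply text are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: user messages labeled with a concern category.
TRAIN_MESSAGES = [
    "mi pareja revisa mi teléfono y mis redes",   # partner checks my phone and accounts
    "me controla con quién hablo",                # controls who I talk to
    "no me deja salir sola",                      # won't let me go out alone
    "me grita y me insulta todos los días",       # yells at and insults me every day
    "me humilla frente a otras personas",         # humiliates me in front of others
    "me hace sentir que todo es mi culpa",        # makes me feel everything is my fault
]
TRAIN_LABELS = [
    "control", "control", "control",
    "emotional_abuse", "emotional_abuse", "emotional_abuse",
]

# Responses are written and reviewed ahead of time, so the bot never
# improvises safety-critical guidance at inference time.
VETTED_RESPONSES = {
    "control": "Lo que describes puede ser violencia de control. ...",
    "emotional_abuse": "Los insultos y la humillación constante son violencia emocional. ...",
}

# Supervised, non-generative pipeline: TF-IDF features + a linear classifier.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(TRAIN_MESSAGES, TRAIN_LABELS)

def respond(message: str) -> str:
    """Classify an incoming message and return the pre-approved response."""
    label = classifier.predict([message])[0]
    return VETTED_RESPONSES[label]

print(respond("mi novio lee mis mensajes sin permiso"))
```

In a design like this, the model's only job is to choose among known categories; every sentence the user sees was drafted and checked by people beforehand, which is the consistency-over-improvisation trade-off described above.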
A core function of the service is routing to qualified humans when needed. Violetta triages high-risk or complex situations to the “Purple Line,” staffed by trained therapists, and can also provide contact information for legal aid and related organizations. Since launch, it has directed roughly 40,000 users to specialists for live support. This hybrid model—machine triage with human follow-through—aims to shorten the time between first disclosure and effective assistance, especially for users who might otherwise delay speaking to a person for months or years.
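The following sketch shows the hybrid triage idea in simplified form: a classified message carrying an assumed risk level is routed either to vetted informational guidance, to legal-aid contact information, or to a live hand-off with the Purple Line's therapists. The risk scale, category names, thresholds, and reply text are hypothetical placeholders, not the service's actual triage rules.

```python
# Minimal sketch of risk-based routing: machine triage with human follow-through.
# The labels, risk levels, and messages below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class TriageResult:
    category: str   # e.g. "control", "emotional_abuse", "legal_question"
    risk: str       # "low", "moderate", or "high" (illustrative scale)

def route(result: TriageResult) -> str:
    """Decide the next step: vetted guidance, legal referrals, or human escalation."""
    if result.risk == "high":
        # High-risk or complex cases are handed off to trained therapists
        # (the "Purple Line") for live support.
        return "Te vamos a conectar ahora mismo con una terapeuta para apoyo en vivo."
    if result.category == "legal_question":
        # Provide contact information for legal aid and related organizations.
        return "Estas organizaciones ofrecen asesoría legal gratuita: ..."
    # Otherwise, continue with vetted information and safety-planning steps.
    return "Aquí tienes información y pasos que puedes seguir de forma segura: ..."

print(route(TriageResult(category="control", risk="high")))
```

The point of keeping the routing rules explicit, as in this sketch, is that escalation decisions remain auditable: it is always clear why a given conversation was handed to a human.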
The project originated during the COVID-19 lockdown, when calls to traditional hotlines spiked and many homes became unsafe. The service focuses on empathetic intake, clear next steps, and discreet escalation pathways rather than open-ended general conversation.
In practice, Violetta fits several use cases: private self-assessment and information-seeking; rapid, anonymous triage to crisis counseling; and guided connections to legal and social-service providers. By constraining the technology to supervised ML and pairing it with specialist hand-offs, the model emphasizes reliability, safety, and user control—key principles for technology serving survivors of GBV.